
Sujay's CL-CY-001 Coursework (Level 2)

Sujay Vikram G S

14/9/2023


Task 1 - Understanding the OSI Architecture and Network Fundamentals

I recently took on the task of exploring the OSI (Open Systems Interconnection) architecture and fundamental network concepts, including protocols, switching, routing, handshakes, and IP addressing. Here is a summary of my journey:

OSI Architecture Exploration

The OSI architecture, composed of seven layers, plays a pivotal role in understanding how network communication functions. I gathered valuable insights from various sources, including:

  1. GeeksforGeeks provided a comprehensive overview of the layers of the OSI model, which include the application, transport, and network layers.

  2. Imperva deepened my understanding of application security within the context of the OSI model.

Protocols and Handshakes

Protocols are the language of network communication, and understanding them is crucial. I consulted Baeldung to explore handshakes, which are essential in establishing connections between devices. This helped me grasp the process of initiating and terminating communication.
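
To see a handshake in practice, here is a minimal sketch of my own (not from the sources above) using Node.js's built-in net module. The TCP three-way handshake (SYN, SYN-ACK, ACK) is carried out by the operating system before the connect events fire; the port number is arbitrary.

    // tcp-handshake-demo.js - the OS completes the TCP three-way handshake
    // before these events fire with a ready socket.
    const net = require('net');

    const server = net.createServer((socket) => {
      console.log('Server: connection established');
      socket.end('hello\n'); // send a greeting, then close (FIN exchange)
    });

    server.listen(4000, () => {
      // connect() sends SYN; the callback runs after SYN-ACK and ACK.
      const client = net.connect(4000, '127.0.0.1', () => {
        console.log('Client: handshake complete');
      });
      client.on('data', (chunk) => console.log('Client received:', chunk.toString()));
      client.on('end', () => server.close()); // orderly teardown
    });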

Networking Essentials

To gain a holistic understanding of computer networks, I turned to TutorialsPoint to explore routers and their significance in network architecture. Routers are pivotal in directing data packets efficiently.

IP Addressing

IP addressing is a fundamental concept in networking. I researched various sources to comprehend IP addressing schemes and their role in routing data across networks. This knowledge is crucial for effective network management and troubleshooting.
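
As a worked example of my own: an IPv4 host address is paired with a subnet mask, and a bitwise AND of the two yields the network address that routers use for forwarding decisions. A small Node.js sketch:

    // network-address.js - derive an IPv4 network address from address + mask
    const toOctets = (ip) => ip.split('.').map(Number);

    function networkAddress(ip, mask) {
      const ipOctets = toOctets(ip);
      const maskOctets = toOctets(mask);
      // AND each address octet with the corresponding mask octet.
      return ipOctets.map((octet, i) => octet & maskOctets[i]).join('.');
    }

    // 192.168.1.10 with a /24 mask (255.255.255.0) sits on network 192.168.1.0
    console.log(networkAddress('192.168.1.10', '255.255.255.0')); // -> 192.168.1.0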

In conclusion, my exploration of the OSI architecture and network fundamentals has broadened my understanding of how data flows through computer networks. This knowledge is invaluable for anyone interested in network administration, cybersecurity, or software development involving network communication.

.............................................................

Task 2 - Creating a Serverless Node.js Application: My Experience

To create a server that supports running serverless Node.js code, I chose Google Cloud Functions. This service provides a managed environment for running code without having to manage servers, which is the essence of serverless computing. Code execution is triggered by events such as HTTP requests, database updates, or file uploads.
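
For reference, a minimal HTTP-triggered function in the Node.js runtime looks roughly like this (the function name helloHttp is my own placeholder):

    // index.js - minimal HTTP-triggered Cloud Function sketch; the platform
    // passes Express-style request and response objects.
    exports.helloHttp = (req, res) => {
      const name = req.query.name || 'world'; // optional ?name= query parameter
      res.status(200).send(`Hello, ${name}!`);
    };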

One of the key challenges in serverless computing is dealing with cold starts, which happen when a function is invoked for the first time or after a period of inactivity. During a cold start, the cloud provider must allocate the resources needed to execute the function, which introduces latency. To mitigate this, I opted to keep the function warm by invoking it periodically, ensuring it is always ready to handle incoming requests. I also explored third-party tools from the wider serverless ecosystem, such as AWS Lambda Warmer and the Serverless Plugin Warmup, which tackle the same problem for AWS-based deployments.
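
A keep-warm ping can be as simple as a scheduled HTTPS request to the function's trigger URL. A sketch (the URL is a placeholder; in production a scheduler service such as Cloud Scheduler is a more natural fit than a long-running script):

    // keep-warm.js - ping the function every 5 minutes so it stays warm.
    // FUNCTION_URL is a placeholder for the deployed function's trigger URL.
    const FUNCTION_URL = 'https://REGION-PROJECT.cloudfunctions.net/helloHttp';

    setInterval(async () => {
      try {
        const res = await fetch(FUNCTION_URL); // built-in fetch (Node 18+)
        console.log(`Warm-up ping: ${res.status}`);
      } catch (err) {
        console.error('Warm-up ping failed:', err.message);
      }
    }, 5 * 60 * 1000);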

Another crucial aspect of managing a serverless application is handling its lifecycle. It's essential to terminate the application when it's not in use to avoid unnecessary costs. This can be achieved by setting a timeout value for the function or by using third-party tooling such as AWS Lambda Powertools (again, an AWS-side option).
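
On Google Cloud, for example, the execution timeout can be set when deploying the function (values here are illustrative):

    # Cap each invocation at 60 seconds; the platform tears the instance down
    # after invocations finish or time out, so nothing keeps billing.
    gcloud functions deploy helloHttp --runtime=nodejs20 --trigger-http --timeout=60s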

Steps Taken

Here are the steps I followed to create a serverless Node.js application:

  1. Creation of a Function: I started by creating a new function within the Google Cloud console.

  2. Runtime Environment: I selected Node.js as the runtime environment for my function.

  3. Code Implementation: Next, I wrote the necessary code for my application and uploaded it to the function.

  4. Trigger Configuration: I configured the function's trigger, which, in my case, was an HTTP request.

  5. Testing: Finally, I thoroughly tested the function to ensure it was working as expected.
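
Step 5 can be done with a plain HTTP request against the function's trigger URL (the URL below is a placeholder):

    # Invoke the deployed HTTP function; expects the greeting from the sketch above
    curl "https://REGION-PROJECT.cloudfunctions.net/helloHttp?name=Sujay"
    # -> Hello, Sujay!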

In conclusion, creating a serverless Node.js application involves these fundamental steps, along with addressing challenges like cold starts and lifecycle management. By keeping functions warm and managing their lifecycles carefully, serverless applications can stay cost-effective and performant.

.............................................................

Task 3 - Chat Application Using Socket.io: Project Report

Introduction

The Chat Application using Socket.io is a real-time messaging platform that employs WebSocket technology for instant message delivery. This project utilizes Node.js, Express, Mongoose (for database connectivity), and Socket.io for real-time communication.

Technology Stack

  • Server-Side: Node.js, Express.js, Mongoose for MongoDB interaction, and Socket.io for WebSocket implementation.

  • Client-Side: HTML, CSS, JavaScript, and Socket.io Client library for WebSocket communication.

Project Overview

The chat application allows users to:

  • Sign up, log in, and log out securely.
  • Send and receive messages in real-time without page refresh (a server-side sketch follows this list).
  • Maintain message history for reviewing past conversations.
  • Check the online/offline status of contacts in real-time.
  • Send direct messages to specific individuals.
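
A minimal sketch of the server-side message relay, assuming the Socket.io v4 API (authentication and the Mongoose persistence layer are omitted):

    // server.js - minimal Socket.io relay sketch
    const express = require('express');
    const http = require('http');
    const { Server } = require('socket.io');

    const app = express();
    const server = http.createServer(app);
    const io = new Server(server);

    app.use(express.static('public')); // serves the HTML/CSS/JS client

    io.on('connection', (socket) => {
      console.log(`User connected: ${socket.id}`);

      // Relay each incoming chat message to every connected client.
      socket.on('chat message', (msg) => {
        io.emit('chat message', msg);
      });

      socket.on('disconnect', () => console.log(`User disconnected: ${socket.id}`));
    });

    server.listen(3000, () => console.log('Chat server on http://localhost:3000'));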

Conclusion

The Chat Application using Socket.io demonstrates the practicality of WebSocket technology for real-time messaging. It showcases the capabilities of Node.js and Socket.io for building responsive, real-time applications.

Future Improvements

  1. Group Chat: Implement group chat functionality.

  2. Notifications: Add push notifications for new messages.

  3. File Sharing: Allow users to send and receive files within chats.

  4. User Profiles: Create user profiles with profile pictures and additional information.


.............................................................

Task 4 - Making a Web App

A web app is a software application accessed via web browsers, eliminating the need for installation. Using Firebase, I built a web app in HTML, CSS, and JavaScript. Firebase provided real-time database and authentication services, simplifying user management and data storage. This allowed for dynamic, responsive, and collaborative web app development.
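
A condensed sketch of that wiring, assuming the modular Firebase Web SDK (v9+); the config values and the notes database path are placeholders:

    // app.js - Firebase wiring sketch (config values are placeholders)
    import { initializeApp } from 'firebase/app';
    import { getAuth, signInAnonymously } from 'firebase/auth';
    import { getDatabase, ref, push, onValue } from 'firebase/database';

    const app = initializeApp({
      apiKey: 'YOUR_API_KEY',
      authDomain: 'YOUR_PROJECT.firebaseapp.com',
      databaseURL: 'https://YOUR_PROJECT-default-rtdb.firebaseio.com',
    });

    const auth = getAuth(app);
    const db = getDatabase(app);

    // Sign the visitor in, then write and subscribe to realtime data.
    signInAnonymously(auth).then(() => {
      const notes = ref(db, 'notes');
      push(notes, { text: 'hello', at: Date.now() });            // write
      onValue(notes, (snapshot) => console.log(snapshot.val())); // live updates
    });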

.............................................................

Task 5 - Docker

Introduction

This report outlines the process of setting up automated Docker container builds triggered by pushes to a production branch in a version control system (VCS).

Steps to Automate Docker Builds

  1. Select a CI/CD Tool: Choose a CI/CD tool that suits your project's needs and integrates with your VCS (a sample pipeline configuration follows this list).

  2. Configure CI Pipeline: Set up a CI pipeline within your chosen tool and configure it to monitor changes in the production branch.

  3. Build Docker Image: In the CI pipeline, create a step that builds the Docker image using a Dockerfile.

  4. Push Docker Image: After building the image, push it to a Docker registry for storage and distribution.

  5. Trigger on Push: Ensure that the CI pipeline is set to automatically trigger whenever a push is made to the production branch in your VCS.
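
As a concrete example, assuming GitHub Actions as the CI/CD tool and Docker Hub as the registry, the whole pipeline fits in one workflow file (the image name and secrets are placeholders):

    # .github/workflows/docker-build.yml - sketch assuming GitHub Actions + Docker Hub
    name: Build and push Docker image

    on:
      push:
        branches: [production]   # fire only on pushes to the production branch

    jobs:
      build:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4

          - name: Log in to Docker Hub
            uses: docker/login-action@v3
            with:
              username: ${{ secrets.DOCKERHUB_USERNAME }}
              password: ${{ secrets.DOCKERHUB_TOKEN }}

          - name: Build and push
            uses: docker/build-push-action@v5
            with:
              context: .                  # build from the repo's Dockerfile
              push: true
              tags: myuser/myapp:latest   # placeholder image name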

Benefits

Automating Docker container builds offers several benefits:

  • Ensures consistency in the build process.
  • Facilitates collaboration among development teams.
  • Reduces the risk of manual errors in image creation.
  • Speeds up the deployment process.

.............................................................

Task 6 - Dockerfile Spyware

In this setup:

The Dockerfile sets up an Ubuntu-based container, installs the necessary packages (inotify-tools and rsync), and sets environment variables for the source folder and remote server details.

The transfer-images.sh script monitors the specified source folder for new file creation events using inotifywait. When a new image is detected, it uses rsync to transfer the image to the remote server.
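
A sketch of the two files described above (the base image, watched path, and remote server details are placeholders):

    # Dockerfile - Ubuntu base with the watch-and-transfer tools installed
    FROM ubuntu:22.04
    RUN apt-get update && apt-get install -y inotify-tools rsync openssh-client
    ENV SOURCE_DIR=/watched \
        REMOTE_USER=user \
        REMOTE_HOST=example.com \
        REMOTE_DIR=/backup/images
    COPY transfer-images.sh /usr/local/bin/transfer-images.sh
    RUN chmod +x /usr/local/bin/transfer-images.sh
    CMD ["/usr/local/bin/transfer-images.sh"]

    #!/bin/bash
    # transfer-images.sh - watch SOURCE_DIR and rsync each new image to the remote host
    inotifywait -m -e create --format '%w%f' "$SOURCE_DIR" | while read -r FILE; do
      case "$FILE" in
        *.jpg|*.jpeg|*.png|*.gif)
          rsync -az "$FILE" "$REMOTE_USER@$REMOTE_HOST:$REMOTE_DIR/"
          ;;
      esac
    done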

Build the Docker image with docker build -t image-transfer-container ., after replacing the placeholders with your actual folder paths and remote server details.

Run the resulting image with docker run to start the monitoring and image transfer process.
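
Putting the two commands together (the host path is a placeholder, bind-mounted onto the folder the script watches):

    docker build -t image-transfer-container .
    docker run -v /path/to/photos:/watched image-transfer-container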

.............................................................

Task 7 - Web Scraping and Automation

Beautiful Soup is used for web scraping and parsing HTML or XML documents, making it easy to extract specific data elements from static web pages. Selenium, on the other hand, is designed for web automation tasks, enabling control over web browsers, interaction with dynamic content, and automated testing of web applications. Beautiful Soup is ideal for extracting data, while Selenium excels at automating user interactions and testing web apps.
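
To make the contrast concrete, here are two tiny sketches in Python, the language both libraries target (the URL and selectors are placeholders):

    # scrape_static.py - Beautiful Soup: parse static HTML fetched with requests
    import requests
    from bs4 import BeautifulSoup

    html = requests.get('https://example.com').text
    soup = BeautifulSoup(html, 'html.parser')
    for heading in soup.find_all('h1'):
        print(heading.get_text(strip=True))

    # automate_dynamic.py - Selenium: drive a real browser for dynamic pages
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    driver.get('https://example.com')
    print(driver.find_element(By.TAG_NAME, 'h1').text)  # text after JavaScript runs
    driver.quit()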

"
