
BLOG · 14/9/2023

Task 6: All about neural networks!

Due to space considerations, here is the blog:

Vrushank R Rao (0344)

Task 6:
1. Write a blog about your understanding of Neural Networks and types like CNN, ANN, etc. Make sure to include any mathematical implication. You can add the function calls used to implement the algorithms.
2. Learn about Large Language Models at a basic level and make a blog post explaining how you would build GPT-4.

***

Neural networks are the basis of deep learning, a subset of ML where the algorithms are inspired by the neural connections in the brain, hence the name. Just like any ML algorithm, they take in data as input, train themselves to recognize the patterns, and predict the output for a new set of data (as in a train-test split).

A neural network is a system of interconnected layers through which information is processed and passed on.

Just like in the brain, the basic unit of computation here is the neuron. A group of neurons makes up a layer.

The different layers are:

Input layer - simply relays the input information to the next layer.

Hidden layer - this is where the processing happens; its output is transferred to the next layer.

Output layer - an activation function maps the final values to the desired output, for example class scores in a classification problem.

Connections and weights - the network consists of connections, each connection transferring the output of a neuron i to the input of a neuron j with an associated weight.

Learning rule - the learning rule is an algorithm which modifies the parameters of the neural network so that a given input to the network produces a favoured output.

Types of Neural Networks:

1. Feedforward Neural Networks: In this network, the information moves in only one direction, forward, from the input nodes, through the hidden nodes (if any), to the output nodes. There are no cycles or loops in the network.

2. Single-layer Perceptron: This is the simplest feedforward neural network and does not contain any hidden layer, which means it only consists of a single layer of output nodes.

3. Multi-layer Perceptron (MLP): This class of networks consists of multiple layers of computational units, usually interconnected in a feed-forward way. Each neuron in one layer has directed connections to the neurons of the subsequent layer (a small code sketch of a forward pass is given after this list).

4. Convolutional Neural Network (CNN): Unlike a plain artificial neural network, it has the ability to detect spatial patterns. It has convolutional layers which receive, transform and output like any other layer, but use filters that detect patterns (edges, corners, circles, etc.). The operation they perform is called convolving.

Block diagram of a CNN: (figure)

Convolving - merging of two sets of information.

Step I: (figure)

Step II: (figure)

The matrix of dot products is passed to the next layer, and this keeps going until the output layer.

Meanwhile, there is a pooling layer which compresses the output of the convolution layer (see the convolution-and-pooling sketch below).
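To make the layer picture concrete, here is a minimal sketch of a forward pass through a small multi-layer perceptron (assuming NumPy; the layer sizes and random weights are illustrative placeholders, not a trained model). The key mathematical step is that each neuron computes a weighted sum of its inputs plus a bias and then applies an activation function, i.e. output = f(w · x + b).

```python
import numpy as np

def sigmoid(x):
    # Activation function: squashes the weighted sum into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical layer sizes: 3 inputs -> 4 hidden neurons -> 2 outputs.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))    # weights from input layer to hidden layer
b1 = np.zeros(4)                # biases of the hidden layer
W2 = rng.normal(size=(4, 2))    # weights from hidden layer to output layer
b2 = np.zeros(2)                # biases of the output layer

def forward(x):
    # Each layer computes weighted sums plus biases, applies the activation
    # function, and passes the result forward (no loops or cycles).
    hidden = sigmoid(x @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)
    return output

print(forward(np.array([1.0, 0.5, -0.2])))
```

Training would then consist of applying a learning rule (typically backpropagation with gradient descent) to adjust W1, b1, W2 and b2 until the outputs match the desired targets.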

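The convolving and pooling steps described above can also be sketched directly (again assuming NumPy; the 6x6 image and the vertical-edge filter are made-up illustrative values). The filter slides over the input, and at each position an element-wise product and sum (a dot product) is taken; pooling then keeps only the maximum of each small window, compressing the feature map.

```python
import numpy as np

def convolve2d(image, kernel):
    # Slide the filter over the image; at each position take the
    # element-wise product with the patch and sum it (the dot product step).
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    # Pooling compresses the convolution output by keeping only the
    # maximum value in each size x size window.
    h = feature_map.shape[0] // size
    w = feature_map.shape[1] // size
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = feature_map[i * size:(i + 1) * size,
                                    j * size:(j + 1) * size].max()
    return out

image = np.arange(36, dtype=float).reshape(6, 6)      # toy 6x6 "image"
edge_filter = np.array([[1.0, 0.0, -1.0],
                        [1.0, 0.0, -1.0],
                        [1.0, 0.0, -1.0]])             # vertical-edge detector
feature_map = convolve2d(image, edge_filter)           # 4x4 feature map
print(max_pool(feature_map))                           # 2x2 pooled output
```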
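Finally, since the task asks for the function calls used to implement these algorithms, here is one possible sketch using the Keras API (this assumes TensorFlow/Keras is installed; the layer sizes, input shapes and hyperparameters are illustrative placeholders, not tuned values): an MLP built from Dense layers, and a CNN built from Conv2D and MaxPooling2D layers.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Multi-layer perceptron: fully connected layers stacked in a feed-forward way.
mlp = keras.Sequential([
    layers.Input(shape=(784,)),
    layers.Dense(128, activation="relu"),    # hidden layer
    layers.Dense(10, activation="softmax"),  # output layer (class scores)
])

# CNN: convolution and pooling before the fully connected output layer.
cnn = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, kernel_size=3, activation="relu"),  # filters detect patterns
    layers.MaxPooling2D(pool_size=2),                     # compresses the feature maps
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])

mlp.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])
cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])
# mlp.fit(x_train, y_train, epochs=5)  # training call, given suitable data
```

Calling fit on suitable training data would then run the learning rule, i.e. backpropagation with the chosen optimizer, to adjust the weights of every layer.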