BLOG · 3/6/2025

Neural Networks

Keerthi S

Understanding Neural Networks: A Simple Guide to ANN, CNN & RNN

Neural networks power many AI applications—from recognizing images to translating languages. This guide explains the basics of neural networks and dives deeper into three main types: Artificial Neural Networks (ANN), Convolutional Neural Networks (CNN), and Recurrent Neural Networks (RNN). It also covers the basic math and includes sample Python code.


Introduction to Neural Networks

Neural networks are inspired by the human brain’s structure. They consist of layers of artificial neurons connected by weights. These networks learn from data by adjusting these weights to recognize patterns and make predictions.


Neural Network Structure

A neural network typically has:

  • Input layer: Takes raw data.
  • Hidden layers: Process the data through neurons.
  • Output layer: Produces the final prediction or classification.

Each connection between neurons has a weight, which changes during learning.


Basic Math Behind Neural Networks

Each neuron calculates a weighted sum of inputs:

$$ z = \sum w_i x_i + b $$

Then applies an activation function to add non-linearity:

  • Sigmoid:

$$ \sigma(z) = \frac{1}{1 + e^{-z}} $$

  • ReLU:

$$ f(z) = \max(0, z) $$

These help the network model complex data patterns.
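The neuron equations above can be sketched directly in NumPy. This is a minimal illustration, not a full network: one neuron with hypothetical input, weight, and bias values, passed through both activation functions.

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Passes positive values through, zeroes out negatives
    return np.maximum(0.0, z)

# A single neuron: weighted sum of inputs plus a bias
x = np.array([0.5, -1.0, 2.0])   # inputs (example values)
w = np.array([0.4, 0.3, 0.1])    # weights (example values)
b = 0.1                          # bias

z = np.dot(w, x) + b             # z = sum(w_i * x_i) + b = 0.2
print(sigmoid(z), relu(z))
```

Here `z` works out to 0.2; sigmoid maps it to about 0.55, while ReLU leaves it at 0.2.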


Artificial Neural Networks (ANN)

ANNs are the fundamental type of neural network. They are made of fully connected layers where every neuron connects to every neuron in the next layer. ANNs work well on many problems like classifying emails, predicting prices, or identifying fraud. They learn by comparing predictions with actual results and adjusting weights through a process called backpropagation.
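The "compare, then adjust weights" loop can be shown on the smallest possible case: a single sigmoid neuron trained on one made-up example with gradient descent. The data, learning rate, and step count here are illustrative assumptions, but the update rule is the core of backpropagation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hypothetical training example: 3 input features, target label 1
x = np.array([0.5, -1.0, 2.0])
y = 1.0

rng = np.random.default_rng(0)
w = rng.normal(size=3)  # random initial weights
b = 0.0
lr = 0.5                # learning rate

for _ in range(100):
    p = sigmoid(np.dot(w, x) + b)  # forward pass: prediction
    dz = p - y                      # gradient of cross-entropy loss w.r.t. z
    w -= lr * dz * x                # chain rule: dL/dw = dz * x
    b -= lr * dz                    # dL/db = dz

print(sigmoid(np.dot(w, x) + b))   # prediction is now close to the target 1
```

Each pass nudges the weights in the direction that reduces the error, which is exactly what backpropagation does across many layers at once.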


Convolutional Neural Networks (CNN)

CNNs are specialized for processing images. Instead of connecting every neuron to every other neuron, CNNs use filters (small matrices) that slide over the image to detect local features like edges, colors, or textures. This operation is called convolution. CNNs combine multiple layers of convolutions and pooling (which reduces data size) to build complex understanding, like recognizing faces or objects in photos.
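The convolution operation itself is simple enough to write by hand. The sketch below slides a small vertical-edge filter over a toy 4×4 "image" (values chosen for illustration) and shows that the output responds exactly where the dark and bright regions meet.

```python
import numpy as np

def convolve2d(image, kernel):
    # "Valid" convolution: slide the kernel over the image, no padding
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# Tiny image with a vertical edge: dark (0) on the left, bright (1) on the right
image = np.array([
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
], dtype=float)

# A vertical-edge filter: responds where left and right pixels differ
kernel = np.array([
    [-1, 1],
    [-1, 1],
], dtype=float)

print(convolve2d(image, kernel))
# Each row is [0. 2. 0.]: the filter fires only at the edge
```

A CNN learns many such filters automatically instead of using hand-designed ones, and stacks them with pooling layers to build up from edges to whole objects.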


Recurrent Neural Networks (RNN)

RNNs are designed for sequential data where the order matters, such as sentences or time series. They have loops that allow information to be passed from one step to the next, giving them a kind of memory. This helps in understanding context—for example, in language translation or speech recognition. RNNs can remember previous words to interpret the meaning of the next word.
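The "loop" in an RNN is just a hidden state that is fed back in at every time step. This is a bare-bones sketch with random example weights and a random input sequence; the sizes (3 input features, 4 hidden units, 5 steps) are arbitrary choices for illustration.

```python
import numpy as np

def rnn_step(x_t, h_prev, Wx, Wh, b):
    # New hidden state mixes the current input with the previous state
    return np.tanh(Wx @ x_t + Wh @ h_prev + b)

rng = np.random.default_rng(1)
Wx = rng.normal(scale=0.5, size=(4, 3))  # input-to-hidden weights
Wh = rng.normal(scale=0.5, size=(4, 4))  # hidden-to-hidden weights (the loop)
b = np.zeros(4)

sequence = rng.normal(size=(5, 3))       # 5 time steps, 3 features each
h = np.zeros(4)                          # initial hidden state

for x_t in sequence:
    h = rnn_step(x_t, h, Wx, Wh, b)      # the same weights are reused every step

print(h)  # final state carries information from the whole sequence
```

Because the same `Wx` and `Wh` are applied at every step, the network can handle sequences of any length, and the final `h` acts as a summary, or memory, of everything it has seen.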


Simple ANN Example in Python

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Toy data so the example runs end to end: 100 samples, 10 features, binary labels
X_train = np.random.rand(100, 10)
y_train = np.random.randint(0, 2, size=100)

model = Sequential()
model.add(Dense(64, activation='relu', input_shape=(10,)))  # Hidden layer; input with 10 features
model.add(Dense(1, activation='sigmoid'))  # Binary output

model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X_train, y_train, epochs=10, batch_size=32)


UVCE,
K. R Circle,
Bengaluru 01