Date: 14ᵗʰ December 2023
Kathmandu Model Secondary School
Artificial Neural Networking
By: Rij Amatya, Pragyan Shrestha, Kritika Silwal
INTRODUCTION
Artificial Neural Networking (ANN)
A branch of machine learning models
First proposed in 1944
An information processing model
A computational model inspired by the human brain's structure and function
Follows principles of neuronal organization
Makes predictions based on what it has learned
Applied in the field of AI
Learns from examples
Artificial Neural Networks
Core of deep learning
Used in image recognition, NLP, robotics, etc.
3 different layers:
1. Input layer
2. Hidden layer
3. Output layer
Sometimes referred to as the MLP (Multi-Layer Perceptron).
The amount of data produced has increased, and big data has become a buzzword. Big data made it easier to train ANNs: while classical machine learning algorithms fell short when analyzing big data, artificial neural networks performed well on it.
Biological Neural Networks
A biological neuron consists of a cell body containing the nucleus, many branched dendrites, and a long axon; the axon is much longer than the cell body. The axon divides into many branches that connect with the dendrites or cell bodies of other neurons.
Biological neurons generate electrical signals that travel along the axon. If a neuron gets enough stimulation, it fires. In general, this is how biological neurons work. The behaviour of a single neuron may seem simple, but billions of interconnected neurons can together perform very complex processing.
Artificial Neurons
Associated with weights, which carry information about the input signal
An ANN comprises interconnected nodes (neurons) organized in layers
The neurons are connected to each other by weighted links
Perceptron
A perceptron is the smallest unit of a neural network. The architecture was developed by Frank Rosenblatt in 1957 and is a simple model of a biological neuron in an artificial neural network.
In this architecture, the inputs and the output are numbers, and each input has a weight. The weighted inputs are summed first, and then a bias is added. The sum is passed through a step function, which can be, for example, the sign function.
The perceptron can be used as an algorithm for binary classification. A simple threshold logic unit (TLU) computes a linear function of its inputs and applies a threshold to the result. This simple approach paved the way for the development of AI tools like ChatGPT.
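A minimal sketch of that computation in Python is shown below; the function name, weights, and bias values are illustrative, not taken from the slides:

import numpy as np

def perceptron_output(x, w, b):
    # Weighted sum of the inputs plus a bias term
    z = np.dot(w, x) + b
    # Step function (a sign-style threshold): fire if z >= 0
    return 1 if z >= 0 else 0

# Example with two inputs and illustrative weights
x = np.array([1.0, 0.5])
w = np.array([0.4, -0.2])
b = 0.1
print(perceptron_output(x, w, b))  # -> 1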
(Image: a perceptron with two inputs and three outputs, connected via a dense layer; this layout is used for multi-label classification.)
How to train a perceptron?
If you understand how a perceptron is trained, you will have a better understanding of how ANNs work, so let's now discuss how to train one.
First, a random weight is assigned to each input. The weighted inputs are summed, and then a bias is added to this sum. Note that when one neuron triggers another neuron frequently, the connection between them becomes stronger. The inputs passing through the neuron produce an output, and this output is a prediction. The actual value is compared with the prediction and the error is calculated. The weights are then updated to make predictions with fewer errors.
The perceptron was a nice approach, but it failed to solve some simple problems such as XOR. To overcome the limitations of the perceptron, the multilayer perceptron was developed. Let's dive into multilayer perceptrons.
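A minimal sketch of this training loop, assuming the classic perceptron learning rule; the learning rate, epoch count, and AND-gate data are illustrative choices:

import numpy as np

# AND-gate data: a problem a single perceptron can learn
# (XOR, as noted above, is not linearly separable and cannot be learned)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # random initial weights
b = 0.0
lr = 0.1                 # learning rate (illustrative)

def step(z):
    return 1 if z >= 0 else 0

for epoch in range(20):
    for xi, target in zip(X, y):
        pred = step(np.dot(w, xi) + b)   # weighted sum + bias, then step function
        error = target - pred            # compare the actual value with the prediction
        w = w + lr * error * xi          # update the weights to reduce the error
        b = b + lr * error

print([step(np.dot(w, xi) + b) for xi in X])  # expected: [0, 0, 0, 1]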
Multi-layer Perceptron
A multilayer perceptron consists of an input layer, one or more hidden layers, and an output layer. If there is more than one hidden layer, the network is called a deep neural network, and this is where deep learning comes into play. Deep learning became popular with the development of modern AI architectures.
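A minimal sketch of a forward pass through a network with one hidden layer, using NumPy; the layer sizes and the sigmoid activation are illustrative choices, not specified in the slides:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)

# Illustrative sizes: 3 inputs -> 4 hidden neurons -> 1 output
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

def forward(x):
    h = sigmoid(W1 @ x + b1)      # hidden layer: weighted sums + biases, then activation
    return sigmoid(W2 @ h + b2)   # output layer produces the prediction

x = np.array([0.2, 0.7, 0.1])
print(forward(x))                 # a single prediction between 0 and 1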
In short, the inputs pass through the neural network and a prediction is made. But how can the prediction of a neural network be improved? This is where the backpropagation algorithm comes in. The algorithm takes the network's output error and propagates that error backward through the network, and the weights are updated accordingly. This cycle continues until the best possible prediction is obtained.
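A minimal sketch of this cycle for a small one-hidden-layer network, using squared-error loss and hand-derived sigmoid gradients; the XOR data, layer sizes, learning rate, and epoch count are illustrative choices:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR data: the problem a single perceptron could not solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

rng = np.random.default_rng(2)
W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)   # 2 inputs -> 4 hidden neurons
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # 4 hidden neurons -> 1 output
lr = 0.5

for epoch in range(5000):
    for xi, target in zip(X, y):
        # Forward pass: compute the prediction
        h = sigmoid(W1 @ xi + b1)
        out = sigmoid(W2 @ h + b2)

        # Backward pass: propagate the output error back through the network
        d_out = (out - target) * out * (1 - out)   # error signal at the output
        d_h = (W2.T @ d_out) * h * (1 - h)         # error signal at the hidden layer

        # Update the weights and biases to reduce the error
        W2 -= lr * np.outer(d_out, h)
        b2 -= lr * d_out
        W1 -= lr * np.outer(d_h, xi)
        b1 -= lr * d_h

preds = [sigmoid((W2 @ sigmoid(W1 @ xi + b1) + b2)[0]) for xi in X]
print(np.round(preds, 2))   # should approach [0, 1, 1, 0]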
A P P L I C A T I O N S
Natural Language Processing (NLP): ANNs power language translation, sentiment analysis, and chatbots in NLP applications.
Medical Diagnosis: ANNs analyze medical images for disease detection, assisting in diagnostics using techniques like computer-aided diagnosis.
Financial Forecasting: ANNs predict stock prices and market trends and assess financial risk by analyzing historical data.
Autonomous Vehicles: ANNs contribute to the development of self-driving cars by enabling object detection, lane keeping, and decision-making.
Image and Speech Recognition: ANNs are used for image and speech recognition, enabling applications like facial recognition and voice assistants.
C H A L L E N G E S A N D L I M I T A T I O N S

CHALLENGES
ANNs often require large labeled datasets for training, and obtaining such data can be challenging.
Training complex neural networks may demand significant computational power.
Neural networks can overfit to the training data, capturing noise and hindering generalization to new data.
Issues related to bias, fairness, and accountability arise, especially in sensitive applications or high-stakes decision-making.

LIMITATIONS
ANNs require large amounts of labeled data for effective training, and performance may suffer with insufficient or biased datasets.
Training deep networks can be computationally demanding.
Models may memorize training-data noise rather than learning general patterns.
Neural networks are often perceived as black-box models, making it challenging to interpret their decision-making processes.