4/3/2020 1
INTRODUCTION TO
NEURAL NETWORKS
BY
Mr. A. Arulkumar,
Assistant Professor
Dept. of Mechatronics Engg.
Kamaraj College of Engg. & Technology
Email: arulkumarmtr@kamarajengg.edu.in
DEFINITION OF NEURAL
NETWORKS
According to the DARPA Neural Network Study
(1988, AFCEA International Press, p. 60):
• ... a neural network is a system composed of
many simple processing elements operating in
parallel whose function is determined by network
structure, connection strengths, and the
processing performed at computing elements or
nodes.
According to Haykin (1994), p. 2:
• A neural network is a massively parallel
distributed processor that has a natural
propensity for storing experiential knowledge
and making it available for use. It resembles
the brain in two respects:
– Knowledge is acquired by the network
through a learning process.
– Interneuron connection strengths known
as synaptic weights are used to store the
knowledge.
BRAIN COMPUTATION
The human brain contains about 10 billion
nerve cells, or neurons. On average, each
neuron is connected to other neurons through
approximately 10,000 synapses.
INTERCONNECTIONS IN
BRAIN
BIOLOGICAL (MOTOR) NEURON
ARTIFICIAL NEURAL NET
• Information-processing system.
• Neurons process the information.
• The signals are transmitted by means of connection links.
• The links possess an associated weight.
• The output signal is obtained by applying activations to the net input.
MOTIVATION FOR NEURAL NET
• Scientists are challenged to use machines
more effectively for tasks currently solved by
humans.
• Symbolic rules don't reflect processes actually
used by humans.
• Traditional computing excels in many areas,
but not in others.
4/3/2020 9
The major areas in which it falls short are:
• Massive parallelism,
• Distributed representation and computation,
• Learning ability,
• Generalization ability,
• Adaptivity,
• Inherent contextual information processing,
• Fault tolerance,
• Low energy consumption.
ARTIFICIAL NEURAL NET

[Figure: input neurons X1 and X2 feed the output neuron Y through weights W1 and W2.]

The figure shows a simple artificial neural net with two input neurons (X1, X2) and one output neuron (Y). The interconnecting weights are given by W1 and W2.
ASSOCIATION OF BIOLOGICAL NET
WITH ARTIFICIAL NET
PROCESSING OF AN ARTIFICIAL NET
The neuron is the basic information-processing unit of a NN. It consists of:
1. A set of links, describing the neuron inputs, with weights W1, W2, …, Wm.
2. An adder function (linear combiner) for computing the weighted sum of the inputs (real numbers):
   u = Σ (j = 1 to m) Wj Xj
3. An activation function φ for limiting the amplitude of the neuron output:
   y = φ(u + b), where b is the bias.
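The three components above can be sketched in a few lines of Python (a minimal illustration; the input values, weights, bias, and the choice of a logistic activation are all arbitrary for the example):

```python
import math

def neuron_output(x, w, b):
    """Adder (linear combiner) followed by an activation function."""
    u = sum(wj * xj for wj, xj in zip(w, x))   # u = sum_j Wj * Xj
    return 1.0 / (1.0 + math.exp(-(u + b)))    # y = phi(u + b), logistic phi

y = neuron_output(x=[0.5, -1.0, 0.25], w=[0.4, 0.1, 0.8], b=0.2)
print(round(y, 4))  # prints 0.6225
```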
BIAS OF AN ARTIFICIAL NEURON
The bias value is added to the weighted sum ∑wixi so that the decision boundary can be shifted away from the origin:
Yin = ∑wixi + b, where b is the bias.

[Figure: the lines x1 − x2 = 1, x1 − x2 = 0, and x1 − x2 = −1 in the (x1, x2) plane, showing the boundary shifted by different bias values.]
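A hedged illustration of how the bias shifts the boundary, using the weights w1 = 1, w2 = −1 that match the lines in the figure:

```python
def net_input(x1, x2, b):
    # y_in = sum(w_i * x_i) + b, with w = (1, -1)
    return 1 * x1 + (-1) * x2 + b

# With b = 0 the boundary x1 - x2 = 0 passes through the origin;
# with b = -1 it shifts to the parallel line x1 - x2 = 1.
print(net_input(0.0, 0.0, b=0.0))   # 0.0: on the boundary x1 - x2 = 0
print(net_input(1.0, 0.0, b=-1.0))  # 0.0: on the shifted boundary x1 - x2 = 1
```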
MULTI LAYER ARTIFICIAL
NEURAL NET
 INPUT: records without the class attribute, with normalized attribute values.
 INPUT VECTOR: X = {x1, x2, …, xn}, where n is the number of (non-class) attributes.
 INPUT LAYER: there are as many nodes as non-class attributes, i.e., as the length of the input vector.
 HIDDEN LAYER: the number of nodes in the hidden layer and the number of hidden layers depend on the implementation.
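The layer sizes described above can be sketched as a tiny forward pass (an illustration only: the sizes, random weight ranges, and sigmoid activation are placeholder choices, not trained or prescribed values):

```python
import math
import random

random.seed(0)
n_inputs, n_hidden, n_outputs = 4, 3, 1   # one input node per non-class attribute

# Placeholder weight matrices, initialized randomly
W1 = [[random.uniform(-0.5, 0.5) for _ in range(n_inputs)] for _ in range(n_hidden)]
W2 = [[random.uniform(-0.5, 0.5) for _ in range(n_hidden)] for _ in range(n_outputs)]

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]   # hidden layer
    return [sigmoid(sum(w * hi for w, hi in zip(row, h))) for row in W2]  # output layer

y = forward([0.2, 0.7, 0.1, 0.9])  # normalized attribute values
print(len(y))  # one output node
```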
OPERATION OF A NEURAL NET

[Figure: the input vector x = (x0, x1, …, xn) is multiplied by the weight vector w = (w0j, w1j, …, wnj); the weighted sum ∑ (which carries the bias via x0 and w0j) is passed through the activation function f to produce the output y.]
WEIGHT AND BIAS UPDATING
Per Sample Updating
- updating weights and biases after the
presentation of each sample.
Per Training Set Updating (Epoch or Iteration)
- weight and bias increments could be
accumulated in variables and the weights and
biases updated after all the samples of the
training set have been presented.
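The two update schedules can be contrasted in code (a sketch; `grad` is a placeholder for whatever increment the chosen learning rule computes, and the toy rule below is purely illustrative):

```python
def per_sample_update(w, samples, grad, lr=0.1):
    """Apply the increment immediately after each sample is presented."""
    for x, t in samples:
        w = [wi + lr * g for wi, g in zip(w, grad(w, x, t))]
    return w

def per_epoch_update(w, samples, grad, lr=0.1):
    """Accumulate increments in variables; apply them once per epoch."""
    acc = [0.0] * len(w)
    for x, t in samples:
        acc = [a + g for a, g in zip(acc, grad(w, x, t))]
    return [wi + lr * a for wi, a in zip(w, acc)]

# Toy rule: push the weight vector toward t * x (illustrative only)
grad = lambda w, x, t: [t * xi - wi for xi, wi in zip(x, w)]
samples = [([1.0, 0.0], 1), ([0.0, 1.0], -1)]
print(per_sample_update([0.0, 0.0], samples, grad))
print(per_epoch_update([0.0, 0.0], samples, grad))
```

The two schedules generally produce different weights after one epoch, because per-sample updating lets later samples see the effect of earlier ones.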
STOPPING CONDITION
• All changes in weights (wij) in the previous epoch are
below some threshold, or
• The percentage of samples misclassified in the
previous epoch is below some threshold, or
• A pre-specified number of epochs has expired.
• In practice, several hundreds of thousands of epochs
may be required before the weights will converge.
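The three criteria can be combined in a training loop like this (a sketch; `train_one_epoch` is a hypothetical callback assumed to return the largest weight change and the misclassification rate for that epoch):

```python
def train(train_one_epoch, max_epochs=100_000,
          weight_tol=1e-6, error_tol=0.01):
    """Stop when weight changes or the error rate fall below a threshold,
    or after a pre-specified number of epochs has expired."""
    for epoch in range(1, max_epochs + 1):
        max_delta_w, error_rate = train_one_epoch()
        if max_delta_w < weight_tol or error_rate < error_tol:
            return epoch   # converged by one of the first two criteria
    return max_epochs      # epoch budget expired

# Toy stand-in: weight changes shrink geometrically each epoch
deltas = iter(10.0 ** -k for k in range(1, 20))
print(train(lambda: (next(deltas), 0.5)))  # stops once the change drops below 1e-6
```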
BUILDING BLOCKS OF ARTIFICIAL
NEURAL NET
 Network Architecture (Connection
between Neurons)
 Setting the Weights (Training)
 Activation Function
LAYER PROPERTIES
• Input Layer: Each input unit may be
designated by an attribute value possessed
by the instance.
• Hidden Layer: Not directly observable,
provides nonlinearities for the network.
• Output Layer: Encodes possible values.
TRAINING PROCESS
 Supervised Training - Providing the network
with a series of sample inputs and comparing
the output with the expected responses.
 Unsupervised Training - Most similar input
vector is assigned to the same output unit.
 Reinforcement Training - The right answer is not
provided, but an indication of whether the output is
‘right’ or ‘wrong’ is.
DEFINITION OF SUPERVISED
LEARNING NETWORKS
• Uses training and test data sets.
• In the training set, both the input and the target
output are specified.
UNSUPERVISED LEARNING
 No help from the outside.
 No training data, no information available on the
desired output.
 Learning by doing.
 Used to pick out structure in the input:
• Clustering,
• Reduction of dimensionality → compression.
 Example: Kohonen’s Learning Law.
ACTIVATION FUNCTION
Activation level – discrete or continuous.
HARD LIMIT FUNCTION (DISCRETE)
 Binary activation function
 Bipolar activation function
 Identity function
SIGMOIDAL ACTIVATION FUNCTION (CONTINUOUS)
 Binary sigmoidal activation function
 Bipolar sigmoidal activation function
ACTIVATION FUNCTION
Activation functions:
(A) Identity
(B) Binary step
(C) Bipolar step
(D) Binary sigmoidal
(E) Bipolar sigmoidal
(F) Ramp
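The six functions listed can be written down directly (a sketch; the steepness parameter is taken as 1, and the ramp is clamped to [0, 1]):

```python
import math

def identity(x):        return x
def binary_step(x):     return 1 if x >= 0 else 0
def bipolar_step(x):    return 1 if x >= 0 else -1
def binary_sigmoid(x):  return 1.0 / (1.0 + math.exp(-x))                   # range (0, 1)
def bipolar_sigmoid(x): return (1.0 - math.exp(-x)) / (1.0 + math.exp(-x))  # range (-1, 1)
def ramp(x):            return max(0.0, min(1.0, x))

for f in (identity, binary_step, bipolar_step, binary_sigmoid, bipolar_sigmoid, ramp):
    print(f.__name__, round(f(-2), 3), round(f(0), 3), round(f(2), 3))
```

Note that the bipolar sigmoid equals tanh(x/2), which is why it is often interchanged with tanh in practice.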
CONSTRUCTING ANN
• Determine the network properties:
  – Network topology
  – Types of connectivity
  – Order of connections
  – Weight range
• Determine the node properties:
  – Activation range
• Determine the system dynamics:
  – Weight initialization scheme
  – Activation-calculating formula
  – Learning rule
PROBLEM SOLVING
 Select a suitable NN model based on the nature
of the problem.
 Construct a NN according to the characteristics
of the application domain.
 Train the neural network with the learning
procedure of the selected model.
 Use the trained network for making inference or
solving problems.
NEURAL NETWORKS
 A neural network learns by adjusting the weights
so as to correctly classify the training data and
hence, after the testing phase, to classify
unknown data.
 A neural network needs a long time for training.
 A neural network has a high tolerance for noisy
and incomplete data.
SALIENT FEATURES OF ANN
•Adaptive learning
•Self-organization
•Real-time operation
•Fault tolerance via redundant information
coding
•Massive parallelism
•Learning and generalizing ability
•Distributed representation
McCULLOCH–PITTS NEURON
• Neurons are sparsely and randomly connected.
• Firing state is binary (1 = firing, 0 = not firing).
• All but one neuron are excitatory (tend to increase
the voltage of other cells).
  – One inhibitory neuron connects to all other neurons.
  – It functions to regulate network activity
    (prevent too many firings).
LINEAR SEPARABILITY
Linear separability is the concept wherein
the separation of the input space into
regions is based on whether the network
response is positive or negative.
Consider a network that gives a
positive response in the first
quadrant and a negative
response in all other quadrants
(the AND function), with either
binary or bipolar data; the
decision line is then drawn
separating the positive-response
region from the negative-response
region.
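For the bipolar AND function, one such decision line is x1 + x2 − 1 = 0; a sketch (the weights w1 = w2 = 1 and bias b = −1 are one of many valid choices, not the only ones):

```python
def response(x1, x2, w1=1, w2=1, b=-1):
    """Positive response iff the point lies on the positive side
    of the decision line w1*x1 + w2*x2 + b = 0."""
    return 1 if w1 * x1 + w2 * x2 + b > 0 else -1

# Bipolar AND: positive response only for the point (+1, +1)
for x1, x2 in [(1, 1), (1, -1), (-1, 1), (-1, -1)]:
    print((x1, x2), response(x1, x2))
```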
HEBB NETWORK
Donald Hebb stated in 1949 that, in the brain,
learning is performed by changes in the synaptic
gap. Hebb explained it:
“When an axon of cell A is near enough to excite
cell B, and repeatedly or persistently takes part
in firing it, some growth process or metabolic
change takes place in one or both cells such that
A’s efficiency, as one of the cells firing B, is
increased.”
HEBB LEARNING
• The weights between neurons whose activities
are positively correlated are increased:
  dwij/dt ~ correlation(xi, xj)
• Associative memory is produced automatically.
The Hebb rule can be used for pattern
association, pattern categorization, pattern
classification, and a range of other areas.
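The rule translates directly into the weight update Δwi = xi·y (learning rate 1). A sketch training the bipolar AND patterns with a single Hebb pass, as in the classic textbook example:

```python
def hebb_train(samples, n_inputs):
    """Hebb rule: w_i += x_i * y and b += y for each (input, target) pair."""
    w = [0.0] * n_inputs
    b = 0.0
    for x, y in samples:
        w = [wi + xi * y for wi, xi in zip(w, x)]
        b += y
    return w, b

# Bipolar AND patterns: (inputs, target)
and_samples = [([1, 1], 1), ([1, -1], -1), ([-1, 1], -1), ([-1, -1], -1)]
w, b = hebb_train(and_samples, n_inputs=2)
print(w, b)  # prints [2.0, 2.0] -2.0
```

The learned line 2·x1 + 2·x2 − 2 = 0 separates the single positive AND response from the three negative ones.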
FEW APPLICATIONS OF NEURAL
NETWORKS