Perceptron and Neural Networks
Shaik Nagur Shareef
Dept. of CSE
Vignan’s University
Contents
Introduction
Neural Networks
Perceptron and Examples
Types of NN
Applications
Human Information Processing System
 Highly complex, nonlinear, and parallel computer.
 Has the capability to organize its structural constituents, known as neurons, to perform certain computations, such as pattern recognition, perception, and motor control.
 Many times faster than the fastest digital computer in existence today.
 Neurons are the information-processing units of the human brain.
Interconnected neurons form a neural (nerve) net.
Human Nervous System
 Viewed as a three-stage system.
 Central to the system is the brain, represented by the neural (nerve) net, which continually receives information, perceives it, and makes appropriate decisions.
 The arrows pointing from left to right indicate the forward transmission of signals; the arrows pointing from right to left signify the presence of feedback in the system.
 The receptors convert stimuli from the human body or the external environment into electrical impulses that convey information to the neural net.
 The effectors convert electrical impulses generated by the neural net into discernible responses as system outputs.
Machine Information Processing System
 Neural Networks are made of Artificial Neurons.
 A Neural Network is a machine that is designed to model the way in which the brain performs a particular task or function of interest.
 The network is usually implemented using electronic components or simulated in software on a digital computer.
 Neural Networks perform useful computations through a process of learning.
Biological Neuron vs Machine Neuron
Neural Network
 A Neural Network is a massively parallel distributed processor made up of simple processing units that has a natural propensity for storing experiential knowledge and making it available for use.
 It resembles the brain in two respects:
1. Knowledge is acquired by the network from its environment through a learning process.
2. Interneuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.
 The procedure used to perform the learning process is called a learning algorithm, the function of which is to modify the synaptic weights of the network in an orderly fashion to attain a desired design objective.
 The modification of synaptic weights provides the traditional method for the design of neural networks.
Artificial Neuron - [McPi43]
 In 1943, Warren McCulloch and Walter Pitts introduced one of the first artificial neurons [McPi43].
 The main feature of their neuron model is that a weighted sum of input signals is compared to a threshold to determine the neuron output.
 When the sum is greater than or equal to the threshold, the output is 1.
 When the sum is less than the threshold, the output is 0.
 Networks of these neurons can, in principle, compute any arithmetic or logical function.
 The parameters of [McPi43] networks had to be designed by hand, as no training method was available.
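As a minimal sketch of this threshold unit (the weights and threshold below are illustrative choices, not values from the original):

```python
def mcculloch_pitts(inputs, weights, threshold):
    """McCulloch-Pitts neuron: output 1 iff the weighted sum of the
    inputs reaches the threshold, otherwise output 0."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if weighted_sum >= threshold else 0

# With weights [1, 1] and threshold 2 the unit computes logical AND.
for p in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(p, mcculloch_pitts(p, weights=[1, 1], threshold=2))
```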
Perceptron - [Rose58]
 In the late 1950s, Frank Rosenblatt and several other researchers developed a class of neural networks called perceptrons.
 The neurons in these networks were similar to those of McCulloch and Pitts.
 Rosenblatt's key contribution was the introduction of a learning rule for training perceptron networks to solve pattern recognition problems [Rose58].
 He proved that his learning rule will always converge to correct network weights, if weights exist that solve the problem.
 Learning was simple and automatic: examples of proper behavior were presented to the network, which learned from its mistakes.
 The perceptron could even learn when initialized with random values for its weights and biases.
Learning Rule
 A learning rule is a procedure for modifying the weights and biases of a network.
 This procedure may also be referred to as a training algorithm.
 The purpose of the learning rule is to train the network to perform some task.
 There are many types of neural network learning rules.
 They fall into three broad categories:
1. Supervised learning
2. Unsupervised learning
3. Reinforcement (or graded) learning
Supervised Learning
 In supervised learning, the learning rule is provided with a set of examples (the training set) of proper network behavior: {p1, t1}, {p2, t2}, …, {pQ, tQ}.
 Here pq is an input to the network and tq is the corresponding correct (target) output.
 As the inputs are applied to the network, the network outputs are compared to the targets.
 The learning rule is then used to adjust the weights and biases of the network in order to move the network outputs closer to the targets.
 The perceptron learning rule falls in this supervised learning category.
Unsupervised Learning
 In unsupervised learning, the weights and biases are modified in response to network inputs only.
 There are no target outputs available.
 At first glance this might seem to be impractical.
 How can you train a network if you don't know what it is supposed to do?
 Most of these algorithms perform some kind of clustering operation.
 They learn to categorize the input patterns into a finite number of classes.
 This is especially useful in such applications as vector quantization.
Reinforcement Learning
 Reinforcement learning is similar to supervised learning, except that, instead of being provided with the correct output for each network input, the algorithm is only given a grade.
 The grade (or score) is a measure of the network performance over some sequence of inputs.
 This type of learning is currently much less common than supervised learning.
 It appears to be most suited to control system applications.
Perceptron Architecture
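In the standard notation for a single-layer perceptron (assumed here as the form of the architecture shown), the network computes a = hardlim(Wp + b), where p is the input vector, W is the weight matrix, b is the bias vector, and hardlim is the hard-limit activation function described below.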
Procedure
 Consider the network weight matrix W, whose i-th row is the weight vector of neuron i.
Activation Function (hardlim Function)
[Figure: step (hard-limit) function and sigmoid function.]
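A small sketch of the two activations named above: the step (hard-limit) function used by the perceptron, and the sigmoid used later by back-propagation:

```python
import math

def hardlim(n):
    """Hard-limit (step) activation: 1 if the net input n >= 0, else 0."""
    return 1 if n >= 0 else 0

def sigmoid(n):
    """Logistic sigmoid: a smooth, differentiable squashing function
    mapping any real net input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-n))

print(hardlim(-0.5), hardlim(0.5))  # 0 1
print(round(sigmoid(0.0), 3))       # 0.5
```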
Single-Neuron Perceptron-Example
 Assign values to the weights and bias.
 Therefore, the network output will be 1 for the region above and to the right of the decision boundary.
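A sketch of this example under assumed illustrative values (w1 = w2 = 1, b = −1, not taken from the slides), for which the decision boundary is the line p1 + p2 − 1 = 0:

```python
def perceptron(p1, p2, w1=1.0, w2=1.0, b=-1.0):
    """Single-neuron perceptron: a = hardlim(w1*p1 + w2*p2 + b).
    The weights and bias are assumed example values."""
    n = w1 * p1 + w2 * p2 + b   # net input
    return 1 if n >= 0 else 0   # hardlim

print(perceptron(1.0, 1.0))  # 1: above/right of the boundary p1 + p2 = 1
print(perceptron(0.0, 0.0))  # 0: below/left of the boundary
```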
Logic Function: AND gate
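The gate can be realized with a single hard-limit neuron; one choice of values (assumed here for illustration) is w = [1, 1], b = −1.5:

```python
def and_gate(p1, p2):
    """AND via one perceptron: fires only when both inputs are 1.
    The weights (1, 1) and bias -1.5 are illustrative values."""
    return 1 if (1.0 * p1 + 1.0 * p2 - 1.5) >= 0 else 0

for p in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(p, and_gate(*p))  # -> 0, 0, 0, 1
```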
Weight Update Rule
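For reference, the standard perceptron weight update is: compute the error e = t − a, then set w_new = w_old + e·p and b_new = b_old + e. A minimal sketch that trains the AND gate above with this rule (zero initialization and the epoch count are illustrative choices):

```python
def train_perceptron(samples, epochs=10):
    """Perceptron learning rule: w <- w + e*p and b <- b + e, with e = t - a."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (p1, p2), t in samples:
            a = 1 if (w[0] * p1 + w[1] * p2 + b) >= 0 else 0  # a = hardlim(n)
            e = t - a                                          # error
            w[0] += e * p1                                     # w_new = w_old + e*p
            w[1] += e * p2
            b += e                                             # b_new = b_old + e
    return w, b

and_samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
print(train_perceptron(and_samples))  # reaches weights that realize AND
```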
Training Algorithms
 Adjust neural network weights to map inputs to outputs.
 Use a set of sample patterns where the desired output (given the inputs presented) is known.
 The purpose is to learn to generalize: recognize features which are common to good and bad exemplars.
Training: Key Terms
 Epoch: Presentation of the entire training set to the neural network.
 In the case of the AND function, an epoch consists of four sets of inputs being presented to the network (i.e. [0,0], [0,1], [1,0], [1,1]).
 Error: The amount by which the value output by the network differs from the target value.
 For example, if we required the network to output 0 and it output a 1, then Error = -1.
 Online training: Update weights after each sample.
 Offline (batch) training: Compute the error over all samples, then update the weights (see the sketch after this list).
 Training: Backpropagation procedure, a gradient descent strategy (with the usual problems, such as local minima).
 Prediction: Compute outputs based on the input vector and weights.
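A minimal sketch contrasting the two training regimes on a linear unit (the data, learning rate, and helper names are illustrative):

```python
import numpy as np

def predict(w, x):
    """Linear-unit output o = w . x."""
    return float(np.dot(w, x))

# Toy data: each sample is (input vector with a trailing bias input of 1, target).
samples = [(np.array([0.0, 0.0, 1.0]), 0.0), (np.array([1.0, 1.0, 1.0]), 1.0)]
eta = 0.1  # learning rate

# Online training: update the weights immediately after each sample.
w = np.zeros(3)
for x, t in samples:
    w += eta * (t - predict(w, x)) * x

# Offline (batch) training: accumulate the total update over all samples
# first, then apply it once per pass.
w = np.zeros(3)
delta_w = np.zeros(3)
for x, t in samples:
    delta_w += eta * (t - predict(w, x)) * x
w += delta_w
```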
Gradient Descent Concept
 Error: Sum-of-squares error over the training examples with the current weights.
 Compute the rate of change of the error w.r.t. each weight:
Which weights have the greatest effect on the error?
Effectively, the partial derivatives of the error w.r.t. the weights.
These in turn depend on other weights => chain rule.
 E = G(w): the error as a function of the weights.
 Find the rate of change of the error:
Follow the steepest rate of change (downhill).
Change the weights so that the error is minimized.
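Concretely, for a linear unit o = Σi wi xi with sum-of-squares error E(w) = ½ Σq (tq − oq)² over the training examples (the standard delta-rule derivation, stated here for completeness), the partial derivative of the error w.r.t. each weight is ∂E/∂wi = −Σq (tq − oq) xi,q, so the steepest-descent update Δwi = −η ∂E/∂wi = η Σq (tq − oq) xi,q is exactly the update used in the algorithm on the next slide.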
Gradient Descent Algorithm
Gradient-Descent(training_examples, η)
Each training example is a pair of the form <(x1,…,xn), t>, where (x1,…,xn) is the vector of input values and t is the target output value; η is the learning rate (e.g. 0.1).
 Initialize each wi to some small random value
 Until the termination condition is met, Do
Initialize each Δwi to zero
For each <(x1,…,xn), t> in training_examples Do
Input the instance (x1,…,xn) to the linear unit and compute the output o
For each linear unit weight wi Do
Δwi = Δwi + η (t − o) xi
For each linear unit weight wi Do
wi = wi + Δwi
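A runnable sketch of the pseudocode above for a linear unit (the data, learning rate, and epoch count are illustrative):

```python
import random

def gradient_descent(training_examples, eta=0.1, epochs=100):
    """Batch gradient descent for a linear unit o = sum_i w_i * x_i.
    training_examples: list of (x_vector, t) pairs."""
    n = len(training_examples[0][0])
    w = [random.uniform(-0.05, 0.05) for _ in range(n)]   # small random init
    for _ in range(epochs):
        delta_w = [0.0] * n                               # each Δwi <- 0
        for x, t in training_examples:
            o = sum(wi * xi for wi, xi in zip(w, x))      # linear-unit output
            for i in range(n):
                delta_w[i] += eta * (t - o) * x[i]        # Δwi += η (t − o) xi
        w = [wi + dwi for wi, dwi in zip(w, delta_w)]     # wi += Δwi
    return w

# Illustrative data: t = 2*x1 + 1 (the constant input x2 = 1 acts as a bias).
data = [([0.0, 1.0], 1.0), ([1.0, 1.0], 3.0), ([2.0, 1.0], 5.0)]
print(gradient_descent(data))  # approaches [2.0, 1.0]
```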
[Figure: one gradient-descent step on the error surface in weight space, from (w1, w2) to (w1 + Δw1, w2 + Δw2).]
Back-Propagation
 A training procedure which allows multi-layer feedforward Neural Networks to be trained.
 Can theoretically perform “any” input-output mapping.
 Can learn to solve linearly inseparable problems.
 For feed-forward networks: a continuous function can be differentiated, allowing gradient descent. Back-propagation is an example of a gradient-descent technique.
[Figure: units i → j → k connected by weights wji and wkj, with outputs yi and yj.]
Back-Propagation Algorithm
 Initialize each network weight wi,j to some small random value
 Until the termination condition is met, Do
For each training example <(x1,…,xn), t> Do
Input the instance (x1,…,xn) to the network and compute the network outputs ok
For each output unit k:
δk = ok(1 − ok)(tk − ok)
For each hidden unit h:
δh = oh(1 − oh) Σk wh,k δk
For each network weight wi,j Do
wi,j = wi,j + Δwi,j, where Δwi,j = η δj xi,j
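A compact runnable sketch of this procedure for a network with one sigmoid hidden layer (the network size, training data, learning rate, and epoch count are illustrative assumptions; XOR is used because it is linearly inseparable):

```python
import math, random

def sigmoid(n):
    """Logistic activation; its derivative o*(1 - o) appears in the δ terms."""
    return 1.0 / (1.0 + math.exp(-n))

def train_xor(eta=0.5, epochs=10000, n_hidden=3):
    """Stochastic back-propagation on a 2-n_hidden-1 sigmoid network."""
    random.seed(0)
    # w_ih[i][j]: input i -> hidden j (input index 2 is a bias fixed at 1).
    w_ih = [[random.uniform(-1, 1) for _ in range(n_hidden)] for _ in range(3)]
    # w_ho[j]: hidden j -> output; the final entry is the output bias.
    w_ho = [random.uniform(-1, 1) for _ in range(n_hidden + 1)]
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

    def forward(x1, x2):
        x = (x1, x2, 1.0)                                   # append bias input
        h = [sigmoid(sum(x[i] * w_ih[i][j] for i in range(3)))
             for j in range(n_hidden)]
        o = sigmoid(sum(h[j] * w_ho[j] for j in range(n_hidden)) + w_ho[-1])
        return x, h, o

    for _ in range(epochs):
        for (x1, x2), t in data:
            x, h, o = forward(x1, x2)
            delta_o = o * (1 - o) * (t - o)                   # output-unit δ
            delta_h = [h[j] * (1 - h[j]) * w_ho[j] * delta_o  # hidden-unit δ
                       for j in range(n_hidden)]
            for j in range(n_hidden):                         # w <- w + η δ x
                w_ho[j] += eta * delta_o * h[j]
                for i in range(3):
                    w_ih[i][j] += eta * delta_h[j] * x[i]
            w_ho[-1] += eta * delta_o

    for (x1, x2), t in data:                                  # typically ≈ targets
        print((x1, x2), t, round(forward(x1, x2)[2], 2))

train_xor()
```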
Types of Neural Networks
[Figure: Standard NN, Convolutional NN, Recurrent NN.]
Benefits of Neural Networks
 Nonlinearity: An artificial neuron can be linear or nonlinear.
 Input–Output Mapping: A popular paradigm of learning, called learning with a teacher, or supervised learning, involves modification of the synaptic weights of a neural network by applying a set of labelled training examples, or task examples.
 Adaptivity: Neural networks have a built-in capability to adapt their synaptic weights to changes in the surrounding environment.
 Evidential Response: In the context of pattern classification, a neural network can be designed to provide information not only about which particular pattern to select, but also about the confidence in the decision made.
 Fault Tolerance: A neural network, implemented in hardware form, has the potential to be inherently fault tolerant, or capable of robust computation, in the sense that its performance degrades gracefully under adverse operating conditions.
Applications
Natural language Processing
Optical Character Recognition
Speech recognition
Neural Machine Translation
Video Classification
Emotion Recognition
Face Recognition
Object Detection
Image Classification
Thank You
Any Questions?