CHAPTER 01
ROSENBLATT’S PERCEPTRON
CSC445: Neural Networks
Prof. Dr. Mostafa Gadal-Haqq M. Mostafa
Computer Science Department
Faculty of Computer & Information Sciences
AIN SHAMS UNIVERSITY
(Most of the figures in this presentation are copyright of Pearson Education, Inc.)
ASU-CSC445: Neural Networks Prof. Dr. Mostafa Gadal-Haqq
 Introduction
 The Perceptron
 The Perceptron Convergence Theorem
 Computer Experiment
 The Batch Perceptron Algorithm
Rosenblatt’s Perceptron
Introduction
 The Perceptron:
  is the simplest form of a neural network.
  consists of a single neuron with adjustable synaptic weights and bias.
  can be used to classify linearly separable patterns, i.e., patterns that lie on opposite sides of a hyperplane.
  is limited to pattern classification with only two classes.
The Perceptron
 Linearly and nonlinearly separable classes.
Figure 1.4 (a) A pair of linearly separable patterns. (b) A pair of non-linearly separable patterns.
The Perceptron
 A nonlinear neuron that consists of a linear combiner followed by a hard limiter (e.g., the signum activation function).
 Weights are adapted using an error-correction rule.
Figure 1.3 Signal-flow graph of the perceptron.
The hard limiter and the induced local field:

y = φ(v) = +1 if v > 0; −1 if v ≤ 0

v = Σ_{i=0}^{m} w_i x_i = Σ_{i=1}^{m} w_i x_i + b    (with x_0 = +1 and w_0 = b)
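As an illustration, the linear combiner followed by the hard limiter can be sketched in a few lines (a minimal sketch assuming NumPy; the function name and the example weights are hypothetical, not from the slides):

```python
import numpy as np

def perceptron_output(w, x):
    """Hard-limiter (signum) perceptron: y = +1 if w.x > 0, else -1.

    The bias is folded into the weight vector as w[0], with x[0] = +1.
    """
    v = np.dot(w, x)            # linear combiner: v = sum_i w_i * x_i
    return 1 if v > 0 else -1   # hard limiter

# Example: 2-D input with the bias absorbed into the vectors
w = np.array([-0.5, 1.0, 1.0])   # b = -0.5, w1 = w2 = 1
x = np.array([1.0, 0.7, 0.2])    # x0 = +1 (bias input)
print(perceptron_output(w, x))   # v = -0.5 + 0.7 + 0.2 = 0.4 > 0 -> 1
```

Folding the bias into the weight vector this way is exactly what lets the two sums in the slide's equation coincide.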
The Perceptron
 The decision boundary
 A hyperplane defined by
 For the perceptron to function properly, the two classes C1 and C2 must be linearly separable.
Figure 1.2 Illustration of the hyperplane (in this example, a straight line) as decision boundary for a two-dimensional, two-class pattern-classification problem.
Σ_{i=1}^{m} w_i x_i + b = 0
The Perceptron Convergence Algorithm
 The fixed-increment convergence theorem for the perceptron (Rosenblatt, 1962):
 Let the subsets of training vectors X1 and X2 be linearly separable, and let the inputs presented to the perceptron originate from these two subsets. Then the perceptron converges after some number of iterations n_o, in the sense that

w(n_o) = w(n_o + 1) = w(n_o + 2) = ...

is a solution vector for n_o ≤ n_max.
For the proof, see pages 82-83 of Ch. 01, Haykin.
The Perceptron Convergence Algorithm
 We derive the error-correction learning algorithm as follows:
We write the input vector and the weight vector (with the bias absorbed as w_0 = b) as

x(n) = [+1, x_1(n), x_2(n), ..., x_m(n)]^T
w(n) = [b(n), w_1(n), w_2(n), ..., w_m(n)]^T

Then the induced local field is

v(n) = Σ_{i=0}^{m} w_i(n) x_i(n) = w^T(n) x(n)

The learning algorithm finds a weight vector w such that:

w^T x > 0 for every input vector x belonging to class C1
w^T x ≤ 0 for every input vector x belonging to class C2
The Perceptron Convergence Algorithm
 Given the subsets of training vectors X1 and X2, the training problem is to find a weight vector w such that the previous two inequalities are satisfied. This is achieved by updating the weights as follows:

w(n + 1) = w(n) − η(n) x(n)   if w^T(n) x(n) > 0 and x(n) belongs to class C2
w(n + 1) = w(n) + η(n) x(n)   if w^T(n) x(n) ≤ 0 and x(n) belongs to class C1

(otherwise, w(n + 1) = w(n).)
 The learning-rate parameter η(n) is a positive number that may vary from step to step. For fixed η, we have the fixed-increment learning rule.
 The algorithm converges as long as η(n) is positive.
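The error-correction rule can be turned into a working training loop. The sketch below (NumPy assumed; the function name and the toy AND problem are illustrative, not from the slides) folds both update cases into the single form w ← w + η·d·x applied only on misclassified samples:

```python
import numpy as np

def train_perceptron(X, d, eta=1.0, max_epochs=100):
    """Fixed-increment perceptron learning (error-correction rule).

    X: (N, m) inputs; d: desired responses in {+1, -1}.
    A +1 bias input is prepended, so w[0] plays the role of b.
    On a misclassified sample the two update cases on the slide
    collapse into: w <- w + eta * d * x.
    """
    Xb = np.hstack([np.ones((len(X), 1)), X])  # x0 = +1
    w = np.zeros(Xb.shape[1])
    for _ in range(max_epochs):
        errors = 0
        for x, t in zip(Xb, d):
            y = 1 if w @ x > 0 else -1         # hard limiter
            if y != t:
                w += eta * t * x               # error-correction step
                errors += 1
        if errors == 0:                        # all samples classified: converged
            break
    return w

# Linearly separable toy problem: logical AND mapped to {-1, +1}
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([-1, -1, -1, 1])
w = train_perceptron(X, d)
```

Because AND is linearly separable, the convergence theorem guarantees the loop terminates with zero errors after finitely many epochs.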
The Perceptron Convergence Algorithm
(Slides 11 and 12 present the algorithm summary as figures.)
Perceptron and Bayes Classifier
 Bayes Classifier
Figure 1.6 Signal-flow graph of the Gaussian classifier.
Perceptron and Bayes Classifier
 Bayes Classifier
Figure 1.7 Two overlapping, one-dimensional Gaussian distributions.
The Batch Perceptron Algorithm
 We define the perceptron cost function as

J(w) = Σ_{x(n)∈H} ( −w^T x(n) d(n) )

 where H is the set of samples x misclassified by a perceptron using w as its weight vector, and d(n) is the desired response (+1 or −1).
 The cost function J(w) is differentiable with respect to the weight vector w. Differentiating J(w) with respect to w yields the gradient vector

∇J(w) = Σ_{x(n)∈H} ( −x(n) d(n) )

 In the method of steepest descent, the adjustment to the weight vector w at each time step of the algorithm is applied in a direction opposite to the gradient vector ∇J(w).
The Batch Perceptron Algorithm
 Accordingly, the algorithm takes the form

w(n + 1) = w(n) + η(n) Σ_{x(n)∈H} x(n) d(n)

 which embodies the batch perceptron algorithm for computing the weight vector w.
 The algorithm is said to be of the "batch" kind because, at each time step, a whole batch of misclassified samples is used to compute the adjustment.
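A minimal sketch of this batch update (NumPy assumed; the function name and the misclassification test d·(wᵀx) ≤ 0 are illustrative choices, not taken verbatim from the slides):

```python
import numpy as np

def batch_perceptron(X, d, eta=1.0, max_iters=100):
    """Batch perceptron via steepest descent on the perceptron cost.

    At each step the whole set H of currently misclassified samples
    (those with d * (w.x) <= 0) is collected, and the update moves
    opposite to the gradient: w <- w + eta * sum over H of d * x.
    """
    Xb = np.hstack([np.ones((len(X), 1)), X])  # prepend the +1 bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(max_iters):
        mis = d * (Xb @ w) <= 0                # membership in H
        if not mis.any():                      # H empty: all classified
            break
        w += eta * (d[mis, None] * Xb[mis]).sum(axis=0)
    return w

# Same toy AND problem, labels in {-1, +1}
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([-1, -1, -1, 1])
w = batch_perceptron(X, d)
```

The contrast with the single-sample rule is only in when the adjustment is applied: here the misclassified samples contribute jointly, once per time step.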
Batch Learning
 Presentation of all N examples in the training sample constitutes one epoch.
 The cost function of the learning process is defined by the average error energy E_av.
 The weights are updated epoch by epoch.
 Advantages:
  Accurate estimation of the gradient vector.
  Parallelization of the learning process.
Computer Experiment: Pattern Classification
Figure 1.8 The double-moon classification problem.
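Data resembling Figure 1.8 can be generated as follows (a sketch under assumed geometry: each class is a half-ring of a given radius and width, with the lower moon shifted right by the radius and down by d; the function and parameter names are hypothetical):

```python
import numpy as np

def double_moon(n=500, radius=10.0, width=6.0, d=1.0, seed=0):
    """Generate a double-moon data set, n points per moon.

    The lower moon is offset by (+radius, -d); a negative d pushes the
    moons into each other, making the classes non-linearly separable.
    """
    rng = np.random.default_rng(seed)
    # Upper moon: angles in [0, pi], radii spread across the ring width
    r1 = rng.uniform(radius - width / 2, radius + width / 2, n)
    t1 = rng.uniform(0, np.pi, n)
    upper = np.c_[r1 * np.cos(t1), r1 * np.sin(t1)]
    # Lower moon: angles in [-pi, 0], shifted right and down
    r2 = rng.uniform(radius - width / 2, radius + width / 2, n)
    t2 = rng.uniform(-np.pi, 0, n)
    lower = np.c_[r2 * np.cos(t2) + radius, r2 * np.sin(t2) - d]
    X = np.vstack([upper, lower])
    y = np.r_[np.ones(n), -np.ones(n)]       # labels: +1 upper, -1 lower
    return X, y

X, y = double_moon(n=500, d=1.0)  # d = 1: separable, as in Figure 1.9
```

With d = 1 a perceptron can separate the moons (Figure 1.9); with d = −4 they overlap and the perceptron cannot (Figure 1.10).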
Computer Experiment: Pattern Classification
Figure 1.9 Perceptron with the double-moon set at distance d = 1.
Computer Experiment: Pattern Classification
Figure 1.10 Perceptron with the double-moon set at distance d = -4.
Homework 1
• Problems: 1.1, 1.4, and 1.5
• Computer Experiment: 1.6
Next Time
Model Building Through Regression