Neural Networks and Learning Machines
Backpropagation for Updating Weights
Neural Network training steps:
1. Weight initialization
2. Inputs application
3. Sum of inputs-weights products (SOP)
4. Activation function
5. Weights adaptation
6. Back to step 2
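The six steps above map directly onto a training loop. Below is a minimal sketch for a single sigmoid neuron; the input values, desired output, iteration count, and initialization range are illustrative assumptions, and the step-5 update uses the gradient derived on the later slides.

```python
import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

rng = np.random.default_rng(0)
w = rng.uniform(-0.5, 0.5, size=2)   # Step 1: weight initialization
b = 0.0

x = np.array([0.1, 0.3])             # Step 2: apply the inputs (example values)
d = 0.03                             # desired output (example value)
eta = 0.01                           # learning rate

for epoch in range(10000):           # Step 6: go back to step 2 and repeat
    s = np.dot(x, w) + b             # Step 3: sum of inputs-weights products (SOP)
    y = sigmoid(s)                   # Step 4: activation function
    grad = (y - d) * y * (1.0 - y)   # Step 5: weights adaptation, using the
    w -= eta * grad * x              #         gradient derived on later slides
    b -= eta * grad
```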
First method: regarding the 5th step (weights adaptation)
The weight update is scaled by a learning rate η (also written α), bounded by 0 ≤ η ≤ 1.
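The slide names only the learning rate, not the exact rule of this first method; as an assumed illustration, a common learning-rate-scaled update is the delta rule sketched below.

```python
import numpy as np

def delta_rule_update(w, x, d, y, eta=0.1):
    """Assumed illustration (delta rule): move each weight by eta * error * input.
    w: weights, x: inputs, d: desired output, y: actual output, eta: learning rate."""
    return w + eta * (d - y) * x

# Usage with arbitrary example values:
w = delta_rule_update(np.array([0.2, -0.1]), np.array([1.0, 0.5]), d=1.0, y=0.3)
```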
Second method: Backpropagation (regarding the 5th step: weights adaptation)

Forward vs. backward passes
Training alternates between a feedforward pass, which maps the inputs to the outputs, and a backward pass, which propagates the prediction error back toward the inputs. The backpropagation algorithm is a sensible approach for dividing the contribution of each weight to that error.

[Figure: forward pass: input and weights → SOP → prediction (output) → prediction error; backward pass: prediction error → prediction (output) → SOP → input and weights.]
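To mirror the diagram, here is a sketch of the two passes as a pair of functions: the forward pass goes from input and weights through the SOP and prediction to the error, and the backward pass retraces that chain in reverse to assign each weight its share of the error. The sigmoid and squared-error forms are the ones used on the following slides.

```python
import numpy as np

def forward_pass(x, w, b, d):
    """Input & weights -> SOP -> prediction -> prediction error."""
    s = np.dot(x, w) + b               # SOP
    y = 1.0 / (1.0 + np.exp(-s))       # prediction (sigmoid)
    E = 0.5 * (d - y) ** 2             # prediction error
    return s, y, E

def backward_pass(x, d, y):
    """Prediction error -> prediction -> SOP -> weights, in reverse order."""
    dE_dy = y - d                      # error w.r.t. prediction
    dy_ds = y * (1.0 - y)              # prediction w.r.t. SOP
    return dE_dy * dy_ds * x           # chain rule: error w.r.t. each weight
```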
Backward pass
Let us work with a simpler example:

$$y = x^2 z + c$$

How do we answer this question: what is the effect on the output $y$ of a change in the variable $x$? This question is answered using derivatives. The derivative of $y$ with respect to $x$, $\partial y / \partial x$, tells us the effect of changing the variable $x$ on the output $y$.
Backward pass: calculating the derivatives

For $y = x^2 z + c$, the derivative $\partial y / \partial x$ can be calculated based on two derivative rules:

Square rule: $\frac{\partial}{\partial x} x^2 = 2x$
Constant rule: $\frac{\partial}{\partial x} c = 0$

The result is:

$$\frac{\partial}{\partial x}\left(x^2 z + c\right) = 2zx + 0 = 2zx$$
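A quick numerical check of this result, using a central finite difference; the sample values of x, z, and c are arbitrary.

```python
def y(x, z, c):
    return x**2 * z + c

x, z, c, h = 1.5, 3.0, 7.0, 1e-6
numeric = (y(x + h, z, c) - y(x - h, z, c)) / (2 * h)   # central difference
analytic = 2 * z * x                                    # the rule derived above
print(numeric, analytic)   # both print approximately 9.0
```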
Backward pass: calculating the derivative of the prediction error w.r.t. the weights

$$E = \frac{1}{2}\left(\text{desired} - \text{predicted}\right)^2$$

$$\text{predicted} = f(s) = \frac{1}{1 + e^{-s}}, \qquad \text{desired} = \text{constant}$$

Substituting the prediction into the error, with the sum of products $s = \sum_{i=1}^{m} x_i w_i + b$:

$$E = \frac{1}{2}\left(d - \frac{1}{1 + e^{-\left(\sum_{i=1}^{m} x_i w_i + b\right)}}\right)^2$$
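Written as code, the substituted error is a single expression in the weights, which is what makes differentiating $E$ with respect to each $w_i$ possible. A small sketch; the example values are arbitrary.

```python
import numpy as np

def prediction_error(x, w, b, d):
    """E = 0.5 * (d - 1 / (1 + exp(-s)))**2, with s = sum_i x_i * w_i + b."""
    s = np.dot(x, w) + b
    predicted = 1.0 / (1.0 + np.exp(-s))
    return 0.5 * (d - predicted) ** 2

E = prediction_error(x=np.array([0.1, 0.3]), w=np.array([0.5, 0.2]), b=1.83, d=0.03)
```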
Second method: Backpropagation (regarding the 5th step: weights adaptation)

Backward pass
What is the change in the prediction error ($E$) given a change in a weight ($w$)? To answer it, get the partial derivative of $E$ with respect to $w$: $\partial E / \partial w$.

$$E = \frac{1}{2}(d - y)^2$$

where $d$ (the desired output) is a constant, and $y$ (the predicted output) is

$$y = f(s) = \frac{1}{1 + e^{-s}}$$

with $s$ the sum of products (SOP):

$$s = \sum_{i=1}^{m} x_i w_i + b$$

Substituting back, the error as a function of the weights $w_1, w_2$ is

$$E = \frac{1}{2}\left(d - \frac{1}{1 + e^{-\left(\sum_{i=1}^{m} x_i w_i + b\right)}}\right)^2$$
Second method: Backpropagation (regarding the 5th step: weights adaptation)

Weight derivative

$$E = \frac{1}{2}(d - y)^2, \qquad y = f(s) = \frac{1}{1 + e^{-s}}, \qquad s = x_1 w_1 + x_2 w_2 + b$$

Because $E$ depends on $w_1$ and $w_2$ only through $y$ and $s$, apply the chain rule:

$$\frac{\partial E}{\partial w_1} = \frac{\partial E}{\partial y} \cdot \frac{\partial y}{\partial s} \cdot \frac{\partial s}{\partial w_1}, \qquad \frac{\partial E}{\partial w_2} = \frac{\partial E}{\partial y} \cdot \frac{\partial y}{\partial s} \cdot \frac{\partial s}{\partial w_2}$$
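The chain rule translates directly into three multiplied factors; a sketch with arbitrary example values.

```python
import numpy as np

x1, x2, b = 0.1, 0.3, 1.83      # inputs and bias (example values)
w1, w2 = 0.5, 0.2               # weights (example values)
d = 0.03                        # desired output (example value)

s = x1 * w1 + x2 * w2 + b       # SOP
y = 1.0 / (1.0 + np.exp(-s))    # sigmoid prediction

dE_dy = y - d                   # first factor: dE/dy
dy_ds = y * (1.0 - y)           # second factor: dy/ds (sigmoid derivative)
dE_dw1 = dE_dy * dy_ds * x1     # third factor for w1: ds/dw1 = x1
dE_dw2 = dE_dy * dy_ds * x2     # third factor for w2: ds/dw2 = x2
```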
Second method: Backpropagation (regarding the 5th step: weights adaptation)

Weight derivative

Error term:
$$\frac{\partial E}{\partial y} = \frac{\partial}{\partial y}\left[\frac{1}{2}(d - y)^2\right] = y - d$$

Activation term (derivative of the sigmoid):
$$\frac{\partial y}{\partial s} = \frac{\partial}{\partial s}\left(\frac{1}{1 + e^{-s}}\right) = \frac{1}{1 + e^{-s}}\left(1 - \frac{1}{1 + e^{-s}}\right)$$

SOP terms:
$$\frac{\partial s}{\partial w_1} = \frac{\partial}{\partial w_1}\left(x_1 w_1 + x_2 w_2 + b\right) = x_1, \qquad \frac{\partial s}{\partial w_2} = \frac{\partial}{\partial w_2}\left(x_1 w_1 + x_2 w_2 + b\right) = x_2$$

Multiplying the three factors gives, for each weight $w_i$:
$$\frac{\partial E}{\partial w_i} = (y - d)\,\frac{1}{1 + e^{-s}}\left(1 - \frac{1}{1 + e^{-s}}\right) x_i$$
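The final formula can be sanity-checked against a finite-difference estimate of $\partial E / \partial w_1$; the example values are arbitrary.

```python
import numpy as np

def error(w1, w2, x1=0.1, x2=0.3, b=1.83, d=0.03):
    s = x1 * w1 + x2 * w2 + b
    return 0.5 * (d - 1.0 / (1.0 + np.exp(-s))) ** 2

w1, w2, h = 0.5, 0.2, 1e-6
s = 0.1 * w1 + 0.3 * w2 + 1.83
y = 1.0 / (1.0 + np.exp(-s))
analytic = (y - 0.03) * y * (1.0 - y) * 0.1                  # (y - d) * y * (1 - y) * x1
numeric = (error(w1 + h, w2) - error(w1 - h, w2)) / (2 * h)  # central difference
print(analytic, numeric)   # the two values agree to several decimal places
```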
Second method: Backpropagation (regarding the 5th step: weights adaptation)

Update the weights
In order to update the weights, use gradient descent:

$$w_{new} = w_{old} - \eta \frac{\partial E}{\partial w}$$

[Figure: two plots of $f(w)$ against $w$. Where the slope is positive, $w_{new} = w_{old} - \eta(+ve)$, so the weight decreases; where the slope is negative, $w_{new} = w_{old} - \eta(-ve)$, so the weight increases. Either way the step moves $w$ downhill toward the minimum.]
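In code, the same sign logic falls out of a single update line: a positive derivative pushes the weight down, a negative one pushes it up. The learning rate and gradient values here are placeholders.

```python
def gradient_descent_step(w_old, dE_dw, eta):
    """Move w against the slope: down if dE/dw > 0, up if dE/dw < 0."""
    return w_old - eta * dE_dw

print(gradient_descent_step(0.5, +2.0, eta=0.01))   # 0.48: positive slope, weight decreases
print(gradient_descent_step(0.5, -2.0, eta=0.01))   # 0.52: negative slope, weight increases
```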
Example
Learning rate: 0.01
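The deck's example uses a learning rate of 0.01. The slide's own numbers are not recoverable here, so the sketch below runs a few gradient-descent steps on assumed inputs to show the error shrinking.

```python
import numpy as np

x = np.array([0.1, 0.3])          # assumed inputs
w = np.array([0.5, 0.2])          # assumed initial weights
b, d, eta = 1.83, 0.03, 0.01      # assumed bias and target; learning rate from the slide

for step in range(3):
    s = np.dot(x, w) + b
    y = 1.0 / (1.0 + np.exp(-s))
    E = 0.5 * (d - y) ** 2
    grad = (y - d) * y * (1.0 - y) * x       # dE/dw from the previous slides
    w = w - eta * grad                       # gradient-descent update
    print(f"step {step}: error = {E:.6f}")   # error decreases slowly each step
```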