By B. Kohila, MSc (IT),
Nadar Saraswathi College of Arts and Science
• A neural network: a set of connected input/output units where each connection has a weight associated with it
• Computer programs
• Pattern detection and machine learning algorithms
• Build predictive models from large databases
• Modeled on the human nervous system
• An offshoot of AI
• Introduced by McCulloch and Pitts
• Originally targeted at image understanding, human learning, and computer speech
• Highly accurate predictive models for a large number of different types of problems
• Ease of use and deployment – poor
• Connections between nodes
• Number of units
• Training level
• Learning capability
• The model is built one record at a time
Classification by backpropagation
• Model
  • Specified by weights
  • Training modifies the weights
• Complex models
  • Fully interconnected
  • Multiple hidden layers
• OR gate functionality (see the sketch below):

  Input 1   Input 2   Weighted Sum   Output
     0         0           0            0
     0         1           1            1
     1         0           1            1
     1         1           2            1
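The weighted sum in the table is simply w1*x1 + w2*x2 with both weights equal to 1, and the unit outputs 1 once the sum reaches a threshold of 1. A minimal sketch of this single threshold unit (the weights and threshold are read off the table, not stated in the slides):

```python
# OR gate as a single threshold unit: fire (output 1) when the weighted
# sum of the inputs reaches the threshold, otherwise output 0.
def or_gate(x1, x2, w1=1, w2=1, threshold=1):
    weighted_sum = w1 * x1 + w2 * x2
    return 1 if weighted_sum >= threshold else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        # prints: input 1, input 2, weighted sum, output (matches the table above)
        print(x1, x2, x1 + x2, or_gate(x1, x2))
```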
• Backpropagation: a neural network learning algorithm
• Started by psychologists and neurobiologists to develop and test computational analogues of neurons
• During the learning phase, the network learns by adjusting the weights so as to be able to predict the correct class label of the input tuples (the standard update rules are summarized below)
• Also referred to as connectionist learning due to the connections between units
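For reference, the usual sigmoid-unit error and update rules behind this weight adjustment (a textbook summary, not taken verbatim from the slides; $l$ is the learning rate, $O_j$ the output of unit $j$, $T_j$ the target value, and $\theta_j$ the bias of unit $j$):

$$Err_j = O_j (1 - O_j)(T_j - O_j) \qquad \text{for an output unit}$$

$$Err_j = O_j (1 - O_j) \sum_k Err_k \, w_{jk} \qquad \text{for a hidden unit}$$

$$w_{ij} \leftarrow w_{ij} + l \, Err_j \, O_i, \qquad \theta_j \leftarrow \theta_j + l \, Err_j$$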
• Weakness
  • Long training time
  • Requires a number of parameters that are typically best determined empirically, e.g., the network topology or "structure"
  • Poor interpretability: it is difficult to interpret the symbolic meaning behind the learned weights and the "hidden units" in the network
• Strength
  • High tolerance to noisy data
  • Ability to classify untrained patterns
  • Well suited for continuous-valued inputs and outputs
  • Successful on a wide array of real-world data
  • Algorithms are inherently parallel
  • Techniques have recently been developed for extracting rules from trained neural networks
Classification by backpropagation
• The inputs to the network correspond to the attributes measured for each training tuple
• Inputs are fed simultaneously into the units making up the input layer
• They are then weighted and fed simultaneously to a hidden layer
• The number of hidden layers is arbitrary, although usually only one is used
• The weighted outputs of the last hidden layer are input to the units making up the output layer, which emits the network's prediction
• The network is feed-forward in that none of the weights cycles back to an input unit or to an output unit of a previous layer (a forward-pass sketch follows this list)
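To make the forward pass concrete, here is a minimal sketch assuming one hidden layer, sigmoid activations, and arbitrary layer sizes and random weights (none of these values come from the slides):

```python
# Forward pass through a small feed-forward network: input -> hidden -> output.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 3, 2                      # assumed layer sizes
W_h = rng.normal(scale=0.5, size=(n_in, n_hidden))   # input -> hidden weights
b_h = np.zeros(n_hidden)                             # hidden biases
W_o = rng.normal(scale=0.5, size=(n_hidden, n_out))  # hidden -> output weights
b_o = np.zeros(n_out)                                # output biases

x = np.array([0.2, 0.7, 0.1, 0.9])       # one training tuple (already normalized)
hidden = sigmoid(x @ W_h + b_h)          # weighted inputs flow forward only
output = sigmoid(hidden @ W_o + b_o)     # the network's prediction, one unit per class
print(output)
```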
• First decide the network topology: # of units in the input layer, # of hidden layers (if > 1), # of units in each hidden layer, and # of units in the output layer
• Normalize the input values for each attribute measured in the training tuples to [0.0, 1.0]
• For discrete-valued attributes, use one input unit per domain value, each initialized to 0
• Output: for classification with more than two classes, one output unit per class is used (both steps are sketched below)
• Once a network has been trained, if its accuracy is unacceptable, repeat the training process with a different network topology or a different set of initial weights
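A minimal sketch of this input/output preparation, assuming a tiny made-up dataset with two numeric attributes and three classes:

```python
# Min-max normalization of numeric attributes to [0.0, 1.0] and
# one output unit per class (one-hot encoding) for a 3-class problem.
import numpy as np

X = np.array([[30.0, 52000.0],
              [45.0, 76000.0],
              [22.0, 31000.0]])          # attribute values of three training tuples
y = np.array([0, 2, 1])                  # class labels

X_min, X_max = X.min(axis=0), X.max(axis=0)
X_norm = (X - X_min) / (X_max - X_min)   # each attribute scaled to [0.0, 1.0]

n_classes = 3
Y = np.eye(n_classes)[y]                 # one output unit per class, 1 for the true class
print(X_norm)
print(Y)
```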
• Updating of weights and biases
  • Case updating
  • Epoch updating
• Terminating conditions
  • Weight changes are below a threshold
  • Error rate / misclassification rate is small
  • A prespecified number of epochs has elapsed
• Efficiency of backpropagation
  • O(|D| x w) per epoch, where |D| is the number of training tuples and w is the number of weights (see the training-loop sketch below)
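Putting the pieces together, here is a minimal training-loop sketch with case (per-tuple) updating and a weight-change threshold as the stopping rule; the layer sizes, learning rate, thresholds, and random data are assumptions, and biases are omitted for brevity:

```python
# Backpropagation with case updating: weights are adjusted after every tuple.
# For epoch updating, accumulate the deltas over all tuples and apply them once per epoch.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
X = rng.random((8, 4))                       # |D| = 8 tuples, 4 attributes in [0, 1]
Y = np.eye(2)[rng.integers(0, 2, size=8)]    # one output unit per class
W_h = rng.normal(size=(4, 3))                # input -> hidden weights
W_o = rng.normal(size=(3, 2))                # hidden -> output weights
lr, weight_eps, max_epochs = 0.5, 1e-4, 500

for epoch in range(max_epochs):              # each epoch costs O(|D| x w)
    max_change = 0.0
    for x, t in zip(X, Y):                   # case updating: one tuple at a time
        h = sigmoid(x @ W_h)                 # forward pass
        o = sigmoid(h @ W_o)
        err_o = (t - o) * o * (1 - o)        # output-layer error (sigmoid derivative)
        err_h = h * (1 - h) * (W_o @ err_o)  # error propagated back to the hidden layer
        dW_o = lr * np.outer(h, err_o)
        dW_h = lr * np.outer(x, err_h)
        W_o += dW_o
        W_h += dW_h
        max_change = max(max_change, np.abs(dW_o).max(), np.abs(dW_h).max())
    if max_change < weight_eps:              # stop when weight changes fall below the threshold
        break
```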
• Rule extraction from networks: network pruning
  • Simplify the network structure by removing the weighted links that have the least effect on the trained network
  • Then perform link, unit, or activation-value clustering
  • The sets of input and activation values are studied to derive rules describing the relationship between the input and hidden-unit layers
• Sensitivity analysis: assess the impact that a given input variable has on a network output; the knowledge gained from this analysis can be represented in rules (a minimal sketch follows)
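A minimal sensitivity-analysis sketch along these lines, assuming a hypothetical trained model that exposes a predict(x) function (the name and interface are assumptions, not part of the slides):

```python
# Perturb one input variable at a time and measure how much the network
# output moves; inputs with little effect are candidates for pruning,
# while strong effects can be summarized as rules.
import numpy as np

def sensitivity(predict, x, delta=0.05):
    base = predict(x)
    scores = []
    for i in range(len(x)):
        x_pert = x.copy()
        x_pert[i] += delta                   # nudge a single input variable
        scores.append(np.abs(predict(x_pert) - base).max())
    return np.array(scores)                  # one impact score per input variable

# Usage with any trained network exposing predict(x) -> output vector:
# impact = sensitivity(net.predict, np.array([0.3, 0.8, 0.1]))
```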