Backpropagation
From Wikipedia, the free encyclopedia

This article is about the computer algorithm. For the biological process, see Neural backpropagation.

Backpropagation, or propagation of error, is a common method of teaching artificial neural networks how to perform a given task. It was first described by Arthur E. Bryson and Yu-Chi Ho in 1969,[1][2] but it was not until 1986, through the work of David E. Rumelhart, Geoffrey E. Hinton and Ronald J. Williams, that it gained recognition and led to a "renaissance" in the field of artificial neural network research.

It is a supervised learning method, and is an implementation of the delta rule. It requires a teacher that knows, or can calculate, the desired output for any given input. It is most useful for feed-forward networks (networks that have no feedback, or simply, no connections that loop). The term is an abbreviation for "backward propagation of errors". Backpropagation requires that the activation function used by the artificial neurons (or "nodes") be differentiable.
Summary

Summary of the backpropagation technique:

1. Present a training sample to the neural network.
2. Compare the network's output to the desired output for that sample, and calculate the error in each output neuron.
3. For each neuron, calculate what the output should have been, and a scaling factor indicating how much lower or higher the output must be adjusted to match the desired output. This is the local error.
4. Adjust the weights of each neuron to lower the local error.
5. Assign "blame" for the local error to neurons at the previous level, giving greater responsibility to neurons connected by stronger weights.
6. Repeat from step 3 on the neurons at the previous level, using each one's "blame" as its error.
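In the common textbook notation (an assumption here, since the article itself gives no formulas): for a differentiable activation f with net input net_j and output o_j, the local error of an output neuron j with target t_j, the "blame" of a hidden neuron h, and the resulting weight adjustment with learning rate η are

  δ_j = (t_j − o_j) · f′(net_j)           (local error of output neuron j)
  δ_h = f′(net_h) · Σ_k w_hk · δ_k        (blame of hidden neuron h, step 5)
  Δw_ij = η · δ_j · o_i                   (weight update, step 4)

The middle formula is step 5 in symbols: each hidden neuron's blame is the blame of the neurons it feeds into, weighted by the strength of the connecting weights w_hk.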
Algorithm

Actual algorithm for a 3-layer network (only one hidden layer):

  Initialize the weights in the network (often randomly)
  Do
      For each example e in the training set
          O = neural-net-output(network, e)   ; forward pass
          T = teacher output for e
          Calculate error (T - O) at the output units
          Compute delta_wh for all weights from hidden layer to output layer   ; backward pass
          Compute delta_wi for all weights from input layer to hidden layer    ; backward pass, continued
          Update the weights in the network
  Until all examples classified correctly or stopping criterion satisfied
  Return the network
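As a concrete illustration, here is a minimal NumPy sketch of the pseudocode above, training a 3-layer network on XOR. The sigmoid activation, the XOR task, the learning rate, and all variable names are illustrative assumptions, not part of the original description:

  import numpy as np

  rng = np.random.default_rng(0)

  def sigmoid(z):
      return 1.0 / (1.0 + np.exp(-z))

  # XOR training set: inputs X, teacher outputs T
  X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
  T = np.array([[0], [1], [1], [0]], dtype=float)

  # Initialize the weights in the network (randomly); biases start at zero
  W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)   # input -> hidden
  W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)   # hidden -> output

  eta = 0.5  # learning rate
  for epoch in range(10000):
      for x, t in zip(X, T):
          # Forward pass
          h = sigmoid(x @ W1 + b1)            # hidden activations
          o = sigmoid(h @ W2 + b2)            # network output O
          # Error (T - O) at the output units, scaled by the sigmoid derivative
          delta_o = (t - o) * o * (1 - o)
          # Backward pass: blame for the hidden layer, weighted by W2
          delta_h = (delta_o @ W2.T) * h * (1 - h)
          # Update the weights in the network (gradient descent on squared error)
          W2 += eta * np.outer(h, delta_o); b2 += eta * delta_o
          W1 += eta * np.outer(x, delta_h); b1 += eta * delta_h

  print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 3))  # approximately [0, 1, 1, 0]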
As the algorithm's name implies, the errors (and therefore the learning) propagate backwards from the output nodes to the inner nodes. Technically speaking, backpropagation is used to calculate the gradient of the error of the network with respect to the network's modifiable weights. This gradient is almost always then used in a simple stochastic gradient descent algorithm to find weights that minimize the error. The term "backpropagation" is often used in a more general sense, to refer to the entire procedure encompassing both the calculation of the gradient and its use in stochastic gradient descent. Backpropagation usually allows quick convergence on satisfactory local minima for error in the kinds of networks to which it is suited.
Backpropagation networks are necessarily multilayer perceptrons (usually with one input, one hidden, and one output layer). In order for the hidden layer to serve any useful function, multilayer networks must have non-linear activation functions in the multiple layers: a multilayer network using only linear activation functions is equivalent to some single-layer, linear network. Commonly used non-linear activation functions include the logistic function, the softmax function, and the Gaussian function.
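The equivalence claim for linear activations is easy to verify directly; the following is a small NumPy check, with shapes and values chosen arbitrarily for illustration:

  import numpy as np

  rng = np.random.default_rng(1)
  x = rng.normal(size=(5, 3))    # 5 samples, 3 inputs
  W1 = rng.normal(size=(3, 4))   # input -> "hidden" (linear, no activation)
  W2 = rng.normal(size=(4, 2))   # "hidden" -> output

  # Two stacked linear layers collapse into one linear layer W1 @ W2
  two_layer = (x @ W1) @ W2
  one_layer = x @ (W1 @ W2)
  print(np.allclose(two_layer, one_layer))  # True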
The backpropagation algorithm for calculating a gradient has been rediscovered a number of times; it is a special case of a more general technique called automatic differentiation, in the reverse accumulation mode. It is also closely related to the Gauss–Newton algorithm, and is part of continuing research in neural backpropagation.

Multithreaded Backpropagation

Backpropagation is an iterative process that can often take a great deal of time to complete. When multicore computers are used, multithreaded techniques can greatly decrease the time backpropagation takes to converge. If batching is being used, it is relatively simple to adapt the backpropagation algorithm to operate in a multithreaded manner.

The training data is broken up into equally large batches, one for each thread. Each thread executes the forward and backward passes. The weight and threshold deltas are summed across the threads. At the end of each iteration, all threads must pause briefly for the weight and threshold deltas to be summed and applied to the neural network. This process continues for each iteration. This multithreaded approach to backpropagation is used by the Encog Neural Network Framework.[3]
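A structural sketch of this scheme in Python follows. The ThreadPoolExecutor, the reuse of the XOR network from the earlier sketch, and all names are illustrative assumptions; this is not Encog's implementation, and CPython's global interpreter lock limits the actual speedup of a pure-Python version:

  import numpy as np
  from concurrent.futures import ThreadPoolExecutor

  rng = np.random.default_rng(0)
  X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
  T = np.array([[0], [1], [1], [0]], dtype=float)
  W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
  W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)

  def sigmoid(z):
      return 1.0 / (1.0 + np.exp(-z))

  def batch_deltas(batch):
      # Forward and backward passes over one thread's batch;
      # returns that thread's summed weight and bias deltas.
      xs, ts = batch
      dW1, db1 = np.zeros_like(W1), np.zeros_like(b1)
      dW2, db2 = np.zeros_like(W2), np.zeros_like(b2)
      for x, t in zip(xs, ts):
          h = sigmoid(x @ W1 + b1)
          o = sigmoid(h @ W2 + b2)
          d_o = (t - o) * o * (1 - o)
          d_h = (d_o @ W2.T) * h * (1 - h)
          dW2 += np.outer(h, d_o); db2 += d_o
          dW1 += np.outer(x, d_h); db1 += d_h
      return dW1, db1, dW2, db2

  n_threads, eta = 2, 0.5
  # Break the training data into equally large batches, one per thread
  batches = list(zip(np.array_split(X, n_threads), np.array_split(T, n_threads)))
  with ThreadPoolExecutor(max_workers=n_threads) as pool:
      for iteration in range(10000):
          # map() returns only once every thread's deltas are ready;
          # this is the brief pause at the end of each iteration.
          parts = list(pool.map(batch_deltas, batches))
          W1 += eta * sum(p[0] for p in parts); b1 += eta * sum(p[1] for p in parts)
          W2 += eta * sum(p[2] for p in parts); b2 += eta * sum(p[3] for p in parts)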

Limitations

- The convergence obtained from backpropagation learning can be very slow.
- Convergence in backpropagation learning is not guaranteed.
- The result may converge to any local minimum on the error surface, since stochastic gradient descent is performed on a non-linear error surface.
- Backpropagation learning scales poorly to large networks and training sets.