Next Assignment
ART1 Demo: Increasing vigilance causes the network to be more selective, introducing a new prototype when the fit is not good. Try different patterns.
Hebbian Learning
Hebb's Postulate
"When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased." -- D. O. Hebb, 1949
In other words, when a weight contributes to firing a neuron, that weight is increased. (If the neuron does not fire, the weight is not increased.)
Colloquial Corollaries
Colloquial Corollaries?
Generalized Hebb Rule
Flavors of Hebbian Learning
Unsupervised Hebbian Learning (aka Associative Learning)
Simple Associative Network (input p, output a)
Banana Associator: an unconditioned stimulus (banana shape) and a conditioned stimulus (banana smell). Didn't Pavlov anticipate this?
Banana Associator Demo: the stimuli can be toggled.
Unsupervised Hebb Rule: w_ij(q) = w_ij(q-1) + α a_i(q) p_j(q)
Vector form: W(q) = W(q-1) + α a(q) p(q)^T, where a is the actual response and p is the input.
Training sequence: p(1), p(2), ..., p(Q)
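As a minimal sketch (the function name and example values are ours, not from the slides), the unsupervised Hebb update W(q) = W(q-1) + α a(q) p(q)^T is a single outer product:

```python
import numpy as np

def hebb_update(W, p, a, alpha=1.0):
    """Unsupervised Hebb rule: W(q) = W(q-1) + alpha * a(q) p(q)^T."""
    return W + alpha * np.outer(a, p)

W = np.zeros((1, 2))          # one neuron, two inputs
p = np.array([0.0, 1.0])      # input vector
a = np.array([1.0])           # actual response
W = hebb_update(W, p, a)      # weight grows where input and output coincide
```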
Learning Banana Smell
Initial weights: w^0 = 1 (unconditioned, shape), w(0) = 0 (conditioned, smell). Learning rate α = 1.
Training sequence: {p^0(1) = 0, p(1) = 1}, {p^0(2) = 1, p(2) = 1}, ...
First iteration (sight fails, smell present):
a(1) = hardlim(w^0 p^0(1) + w(0) p(1) - 0.5) = hardlim(1·0 + 0·1 - 0.5) = 0 (no banana)
w(1) = w(0) + a(1) p(1) = 0 + 0·1 = 0
Example
Second iteration (sight works, smell present):
a(2) = hardlim(w^0 p^0(2) + w(1) p(2) - 0.5) = hardlim(1·1 + 0·1 - 0.5) = 1 (banana)
w(2) = w(1) + a(2) p(2) = 0 + 1·1 = 1
Third iteration (sight fails, smell present):
a(3) = hardlim(w^0 p^0(3) + w(2) p(3) - 0.5) = hardlim(1·0 + 1·1 - 0.5) = 1 (banana)
w(3) = w(2) + a(3) p(3) = 1 + 1·1 = 2
The banana will now be detected if either sensor works.
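The banana-associator iterations can be reproduced in a few lines (an illustrative sketch of our own; the fixed weight w0 = 1, trainable weight w, bias -0.5, and α = 1 follow the example):

```python
hardlim = lambda n: 1.0 if n >= 0 else 0.0

w0, w = 1.0, 0.0              # fixed (shape) and trainable (smell) weights
b = -0.5
# training sequence of (shape p0, smell p): sight fails, works, fails again
seq = [(0, 1), (1, 1), (0, 1)]
outputs = []
for p0, p in seq:
    a = hardlim(w0 * p0 + w * p + b)
    w = w + a * p             # unsupervised Hebb rule, alpha = 1
    outputs.append(a)
```

After the second trial the smell weight is large enough that smell alone triggers detection, matching the slide's conclusion.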
Problems with the Hebb Rule: weights can grow without bound, and there is no mechanism for weights to decrease (no forgetting).
Hebb Rule with Decay
W(q) = W(q-1) + α a(q) p(q)^T - γ W(q-1) = (1 - γ) W(q-1) + α a(q) p(q)^T
This keeps the weight matrix from growing without bound, which can be demonstrated by setting both a_i and p_j to 1 for all iterations; at steady state w_ij^max = (1 - γ) w_ij^max + α, so w_ij^max = α / γ.
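A quick sketch (our own code) confirms the bound: with a_i = p_j = 1 at every step, the weight converges to α/γ rather than growing without limit.

```python
import numpy as np

def hebb_decay(W, p, a, alpha=1.0, gamma=0.1):
    """Hebb rule with decay: W(q) = (1 - gamma) W(q-1) + alpha * a p^T."""
    return (1 - gamma) * W + alpha * np.outer(a, p)

# constant maximal stimulus: the weight approaches alpha/gamma = 10
W = np.zeros((1, 1))
for _ in range(500):
    W = hebb_decay(W, [1.0], [1.0])
```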
Banana Associator with Decay
Example: Banana Associator with Decay (γ = 0.1, α = 1)
First iteration (sight fails, smell present):
a(1) = hardlim(w^0 p^0(1) + w(0) p(1) - 0.5) = hardlim(1·0 + 0·1 - 0.5) = 0 (no banana)
w(1) = w(0) + a(1) p(1) - 0.1 w(0) = 0
Second iteration (sight works, smell present):
a(2) = hardlim(w^0 p^0(2) + w(1) p(2) - 0.5) = hardlim(1·1 + 0·1 - 0.5) = 1 (banana)
w(2) = w(1) + a(2) p(2) - 0.1 w(1) = 0 + 1·1 - 0 = 1
Example
Third iteration (sight fails, smell present):
a(3) = hardlim(w^0 p^0(3) + w(2) p(3) - 0.5) = hardlim(1·0 + 1·1 - 0.5) = 1 (banana)
w(3) = w(2) + a(3) p(3) - 0.1 w(2) = 1 + 1 - 0.1 = 1.9
General Decay Demo (no decay vs. larger decay): w_ij^max = α / γ
Problem of Hebb with Decay
Associations will be lost if stimuli are not occasionally presented. If a_i = 0, then w_ij(q) = (1 - γ) w_ij(q-1). If γ = 0.1, this becomes w_ij(q) = 0.9 w_ij(q-1). Therefore the weight decays by 10% at each iteration where there is no stimulus.
Solution to the Hebb Decay Problem: make the decay term proportional to the network output, so that forgetting occurs only when the neuron is active (the instar rule, below).
Instar (Recognition Network)
Instar Operation
The instar will be active when w_1^T p >= -b, or ||w_1|| ||p|| cos θ >= -b. For normalized vectors, the largest inner product occurs when the angle θ between the weight vector and the input vector is zero, that is, when the input vector equals the weight vector. The rows of a weight matrix represent patterns to be recognized.
Vector Recognition
If we set b = -||w_1|| ||p||, the instar will only be active when θ = 0. If we set b > -||w_1|| ||p||, the instar will be active for a range of angles. As b is increased, more patterns (over a wider range of θ) will activate the instar.
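The effect of the bias on the recognition region can be sketched as follows (our own example vectors; p is assumed unit length so ||w|| ||p|| = 1):

```python
import numpy as np

def instar_active(w, p, b):
    """Instar fires when w^T p + b >= 0, i.e. ||w|| ||p|| cos(theta) >= -b."""
    return float(np.dot(w, p) + b) >= 0

w = np.array([1.0, 0.0])                        # normalized weight vector
p_same = np.array([1.0, 0.0])                   # theta = 0
p_near = np.array([np.cos(0.3), np.sin(0.3)])   # theta = 0.3 rad, still unit length

b_tight = -np.linalg.norm(w) * 1.0   # b = -||w|| ||p||: only theta = 0 fires
b_loose = -0.9                       # larger b: a range of angles fires
```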
Instar Rule
Hebb with decay, modified so that learning and forgetting only occur when the neuron is active:
w_ij(q) = w_ij(q-1) + α a_i(q) p_j(q) - γ a_i(q) w_ij(q-1)
With γ = α this is the instar rule:
w_ij(q) = w_ij(q-1) + α a_i(q) (p_j(q) - w_ij(q-1))
Vector form: w_i(q) = w_i(q-1) + α a_i(q) (p(q) - w_i(q-1)), where w_i is the i-th row of W.
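A sketch of the instar rule (our own code): when the neuron is active the weight vector moves toward the input; when it is inactive the weights do not change.

```python
import numpy as np

def instar_update(w, p, a_i, alpha=0.5):
    """Instar rule: w(q) = w(q-1) + alpha * a_i * (p - w(q-1))."""
    return w + alpha * a_i * (np.asarray(p) - w)

w = np.zeros(2)
p = np.array([1.0, -1.0])
for _ in range(20):
    w = instar_update(w, p, a_i=1.0)           # active: w converges to p
w_inactive = instar_update(w, p, a_i=0.0)      # inactive: no change
```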
Graphical Representation
For the case where the instar is active (a_i = 1): w_i(q) = (1 - α) w_i(q-1) + α p(q), so the weight vector moves toward the input vector along the line between them. For the case where the instar is inactive (a_i = 0): w_i(q) = w_i(q-1), and the weights are unchanged.
Instar Demo: weight vector, input vector, weight matrix W
Outstar (Recall Network)
Outstar Operation
Suppose we want the outstar to recall a certain pattern a* whenever the input p = 1 is presented to the network. Let W = a*. Then, when p = 1, a = W p = a* and the pattern is correctly recalled. The columns of a weight matrix represent patterns to be recalled.
Outstar Rule
For the instar rule we made the weight decay term of the Hebb rule proportional to the output of the network. For the outstar rule we make the decay term proportional to the input of the network:
w_ij(q) = w_ij(q-1) + α a_i(q) p_j(q) - γ p_j(q) w_ij(q-1)
If we make the decay rate γ equal to the learning rate α:
w_ij(q) = w_ij(q-1) + α (a_i(q) - w_ij(q-1)) p_j(q)
Vector form: w_j(q) = w_j(q-1) + α (a(q) - w_j(q-1)) p_j(q), where w_j is the j-th column of W.
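A sketch of the outstar rule (our own code): repeatedly presenting p = 1 drives the corresponding weight column toward the pattern a*, so the pattern can later be recalled from that input alone.

```python
import numpy as np

def outstar_update(W, p, a, alpha=0.5):
    """Outstar rule, column-wise: w_j(q) = w_j(q-1) + alpha * (a - w_j(q-1)) * p_j."""
    W = np.asarray(W, dtype=float).copy()
    a = np.asarray(a, dtype=float)
    for j, pj in enumerate(p):
        W[:, j] += alpha * (a - W[:, j]) * pj
    return W

a_star = np.array([1.0, 0.0, -1.0])   # pattern to be recalled
W = np.zeros((3, 1))
for _ in range(30):
    W = outstar_update(W, p=[1.0], a=a_star)   # column converges to a_star
```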
Example - Pineapple Recall
Definitions
Outstar Demo
Iteration 1 (α = 1)
Convergence
Supervised Hebbian Learning
Linear Associator
Training set: {p_1, t_1}, {p_2, t_2}, ..., {p_Q, t_Q}
Hebb Rule
Simplified form: w_ij^new = w_ij^old + a_i p_j (postsynaptic signal a_i times presynaptic signal p_j).
Supervised form: w_ij^new = w_ij^old + t_i p_j (the actual output a_i is replaced by the desired output t_i for input pattern p).
Matrix form: W^new = W^old + t p^T
Batch Operation
Matrix form (zero initial weights):
W = t_1 p_1^T + t_2 p_2^T + ... + t_Q p_Q^T = T P^T
where T = [t_1 t_2 ... t_Q] and P = [p_1 p_2 ... p_Q].
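The batch form is a single matrix product (a sketch with example prototypes of our own choosing; the columns of P are orthonormal here, so the associator is exact, as the next slide analyzes):

```python
import numpy as np

# batch Hebb rule with zero initial weights: W = T P^T
P = np.array([[1.0, 0.0],
              [0.0, 1.0]])        # columns are input prototypes p_q
T = np.array([[1.0, -1.0],
              [-1.0, 1.0]])       # columns are targets t_q
W = T @ P.T
# since the columns of P are orthonormal, W p_q = t_q for every q
```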
Performance Analysis
a = W p_k = (Σ_q t_q p_q^T) p_k = Σ_q t_q (p_q^T p_k)
Case I: input patterns are orthonormal, so p_q^T p_k = 1 for q = k and 0 for q ≠ k. Therefore the network output equals the target: a = t_k.
Case II: input patterns are normalized but not orthogonal: a = t_k + Σ_{q≠k} t_q (p_q^T p_k), where the second term is an error term.
Example: Banana and Apple
Normalized prototype patterns; weight matrix (Hebb rule); tests on the banana and apple patterns.
Pseudoinverse Rule - (1)
Performance index (mean-squared error): F(W) = Σ_q || t_q - W p_q ||^2
Matrix form: F(W) = || T - W P ||^2 = || E ||^2 = Σ_i Σ_j e_ij^2, where E = T - W P, T = [t_1 t_2 ... t_Q], and P = [p_1 p_2 ... p_Q].
Pseudoinverse Rule - (2)
Minimize F(W) = || T - W P ||^2. If an inverse exists for P, F(W) can be made zero with W = T P^-1. When an inverse does not exist, F(W) can be minimized using the pseudoinverse: W = T P^+, where P^+ = (P^T P)^-1 P^T.
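A sketch contrasting the two rules (our own example; the prototypes are unit length but not orthogonal, the case where the Hebb rule leaves an error term and the pseudoinverse rule does not):

```python
import numpy as np

# normalized but non-orthogonal prototypes (columns of P), with targets T
P = np.array([[1.0, 0.6],
              [0.0, 0.8]])               # columns p_1, p_2, each unit length
T = np.array([[1.0, -1.0]])              # targets t_1, t_2

W_hebb = T @ P.T                         # Hebb rule: leaves an error term here
W_pinv = T @ np.linalg.pinv(P)           # pseudoinverse rule: exact recall
```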
Relationship to the Hebb Rule
Hebb rule: W = T P^T. Pseudoinverse rule: W = T P^+. If the prototype patterns are orthonormal, then P^+ = P^T and the two rules give the same weights: W = T P^T.
Example
Autoassociative Memory
Tests 50% Occluded 67% Occluded Noisy Patterns (7 pixels)
Supervised Hebbian Demo
Spectrum of Hebbian Learning
Basic supervised rule: W^new = W^old + t p^T
Supervised with learning rate: W^new = W^old + α t p^T
Smoothing: W^new = W^old + α t p^T - γ W^old = (1 - γ) W^old + α t p^T
Delta rule: W^new = W^old + α (t - a) p^T (t is the target, a the actual output)
Unsupervised: W^new = W^old + α a p^T
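The delta rule W^new = W^old + α (t - a) p^T can be sketched on the earlier linear associator (example prototypes and α are our own); unlike the one-shot Hebb rule, iterating it drives the error to zero even for non-orthogonal inputs:

```python
import numpy as np

P = np.array([[1.0, 0.6],
              [0.0, 0.8]])              # columns: input prototypes
T = np.array([[1.0, -1.0]])             # columns: targets
W = np.zeros((1, 2))
alpha = 0.5
for _ in range(200):                    # repeated presentations of the set
    for q in range(P.shape[1]):
        p, t = P[:, q], T[:, q]
        a = W @ p                       # linear associator output
        W = W + alpha * np.outer(t - a, p)   # delta rule update
```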