Self Organising Neural Networks
A Problem with Neural Networks.
Beale, R. and Jackson, T. (1990). Neural Computing: An Introduction.
Chapters 5 & 7. Adam Hilger, NY.
Hertz, J., Krogh, A. and Palmer, R. (1991). Introduction to the Theory
of Neural Computation. Chapter 9. Addison–Wesley. NY.
Grossberg, S. (1987). Competitive Learning: from interactive acti-
vation to adaptive resonance. Cognitive Science, 11: 23–63.
Kohonen Self Organising Networks
Kohonen, T. (1982). Self-organized formation of topologically cor-
rect feature maps. Biological Cybernetics, 43: 59–69.
An abstraction from earlier models (e.g. Malsburg, 1973) of
the formation of feature maps (introducing a geometric arrangement of units).
Popular and useful.
Can be traced to biologically inspired origins.
Why have topographic mappings?
– Minimal wiring
– Help subsequent processing layers.
Example: Xenopus retinotectal mapping (Price & Willshaw 2000, p121).
Basic Kohonen Network
Geometric arrangement of units.
Units respond to “part” of the environment.
Neighbouring units should respond to similar parts
of the environment.
Winning unit c selected by:
||x − w_c|| = min_j ||x − w_j||
where w_c is the weight vector of the winning unit c, and
x is the input pattern.
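As a sketch of the winner selection rule above (assuming Euclidean distance and a NumPy weight matrix with one row per unit — names are illustrative):

```python
import numpy as np

def winning_unit(x, W):
    """Return index c of the unit whose weight vector w_c is
    closest to the input pattern x: ||x - w_c|| = min_j ||x - w_j||.
    W holds one weight vector per row."""
    distances = np.linalg.norm(W - x, axis=1)  # ||x - w_j|| for each unit j
    return int(np.argmin(distances))
```

For example, with weight vectors (0,0), (1,1) and (5,5), the input (0.9, 1.1) selects unit 1.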
Neighbourhoods in the Kohonen Network
Example in 2D.
Neighbourhood of winning unit called N_c.
Learning in the Kohonen Network
All units in N_c are updated:
Δw_ij(t) = α(t)[x_i(t) − w_ij(t)]   for j ∈ N_c
Δw_ij(t) = change in weight over time.
α(t) = time-dependent learning parameter.
x_i(t) = input component i at time t.
w_ij(t) = weight from input i to unit j at time t.
Geometrical effect: move weight vector closer to the input.
α is strongest for the winner and can decrease with distance.
It also decreases over time for stability.
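The update rule above can be sketched as a single training step (a minimal version, assuming a hard neighbourhood of fixed radius on a grid of unit positions and a constant α — function and variable names are illustrative):

```python
import numpy as np

def som_update(x, W, grid, c, alpha, radius):
    """One Kohonen step: every unit j whose grid position lies
    within `radius` of the winner c (i.e. j in N_c) moves its
    weight vector towards the input x by alpha * (x - w_j)."""
    for j in range(len(W)):
        if np.linalg.norm(grid[j] - grid[c]) <= radius:  # j in N_c
            W[j] += alpha * (x - W[j])  # Δw_j = α [x − w_j]
    return W
```

In practice α and the radius are both decayed over time, as the slides note, so early updates order the map and later updates fine-tune it.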
Biological origins of the Neighbourhoods: Malsburg (1973).
Implements winner-take-all processing.
Example Application of Kohonen’s Network
The Phonetic Typewriter
(Block diagram: microphone, filter, A/D conversion front end.)
Problem: Classiﬁcation of phonemes in real time.
Pre and post processing.
Network trained on time sliced speech wave forms.
Rules needed to handle co-articulation effects.
A Problem with Neural Networks
Consider 3 network examples:
Kohonen Network, Associative Network, Feed Forward Back-propagation.
Under the situation:
Network learns environment (or I/O relations).
Network is stable in the environment.
Network is placed in a new environment.
Kohonen Network won’t learn.
Associative Network OK.
Feed Forward Back-propagation Forgets.
This is called the Stability/Plasticity Dilemma.
Adaptive Resonance Theory
Grossberg, S. (1976a). Adaptive pattern classiﬁcation and univer-
sal recoding I: Feedback, expectation, olfaction, illusions. Biological
Cybernetics, 23: 187–202.
a “neural network that self–organize[s] stable pat-
tern recognition codes in real time, in response to
arbitrary sequences of input patterns”.
ART1 (1976). Localist representation, binary patterns.
ART2 (1987). Localist representation, analog patterns.
ART3 (1990). Distributed representation, analog patterns.
+ plastic and stable
+ analytical mathematical foundation
(Architecture diagram: Input (x_i) feeds the F1 units; F1 and F2 are linked in both directions.)
F1 → F2 fully connected, excitatory, bottom-up weights (b_ij).
F2 → F1 fully connected, excitatory, top-down weights (t_ij).
Pattern of activation on F1 and F2 called Short Term Memory.
Weight representations called Long Term Memory.
Localist representations of binary input patterns.
Summary of ART 1
(Lippmann, 1987). N = number of F1 units.
Step 1: Initialization
t_ij(0) = 1,  b_ij(0) = 1/(1 + N)
Set vigilance parameter ρ, 0 ≤ ρ ≤ 1.
Step 2: apply new input (binary x_i)
Step 3: compute F2 activation μ_j = Σ_i b_ij x_i for each unit j.
Step 4: find best matching node j*, where μ_j* = max_j μ_j.
Step 5: vigilance test
||T_j* · x|| / ||x|| = (Σ_i t_ij* x_i) / (Σ_i x_i) > ρ ?
If no, go to step 6. If yes go to step 7.
Step 6: mismatch/reset: set μ_j* = 0 (disable unit j*) and go to step 4.
Step 7: resonance — adapt best match
t_ij* ← t_ij* · x_i
b_ij* ← t_ij* x_i / (0.5 + Σ_i t_ij* x_i)
Step 8: Re-enable all F2 units and go to step 2
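The eight steps above can be sketched as a minimal ART1 classifier (following Lippmann's summary; assumes binary 0/1 inputs with at least one active bit, enough F2 units, and ρ < 1 so an uncommitted unit can always resonate — function and variable names are illustrative):

```python
import numpy as np

def art1(patterns, n_f2, rho):
    """Minimal ART1 sketch: assigns each binary pattern to an F2 unit."""
    N = len(patterns[0])
    t = np.ones((n_f2, N))                # Step 1: t_ij(0) = 1
    b = np.full((n_f2, N), 1.0 / (1 + N)) # Step 1: b_ij(0) = 1/(1+N)
    labels = []
    for x in patterns:                    # Step 2: apply new input
        x = np.asarray(x, dtype=float)
        disabled = set()                  # Step 8: all units re-enabled
        while True:
            mu = b @ x                    # Step 3: F2 activations
            for j in disabled:
                mu[j] = -1.0              # Step 6: reset disables a unit
            j = int(np.argmax(mu))        # Step 4: best matching node
            # Step 5: vigilance test  ||T.x|| / ||x|| > rho ?
            if (t[j] * x).sum() / x.sum() > rho:
                break                     # Step 7: resonance
            disabled.add(j)               # mismatch: try next best node
        t[j] = t[j] * x                   # Step 7: adapt top-down weights
        b[j] = t[j] / (0.5 + t[j].sum())  # and bottom-up weights
        labels.append(j)
    return labels
```

Repeated presentations of the same pattern resonate with the same F2 unit (stability), while a sufficiently novel pattern recruits an uncommitted unit (plasticity).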
(Figure: example search sequence over units 1–4 — one pattern reaches resonance at the 3rd choice, another at the 4th; the F2 units represent the learned pattern classes.)
Interesting biological parallels.