Artificial Neural Networks (ANNs)
XOR Step-By-Step
MENOUFIA UNIVERSITY
FACULTY OF COMPUTERS AND INFORMATION
ALL DEPARTMENTS
ARTIFICIAL INTELLIGENCE
โ€ซุงู„ู…ู†ูˆููŠุฉโ€ฌ โ€ซุฌุงู…ุนุฉโ€ฌ
โ€ซูˆุงู„ู…ุนู„ูˆู…ุงุชโ€ฌ โ€ซุงู„ุญุงุณุจุงุชโ€ฌ โ€ซูƒู„ูŠุฉโ€ฌ
โ€ซุงุฃู„ู‚ุณุงู…โ€ฌ โ€ซุฌู…ูŠุนโ€ฌ
โ€ซุงู„ุฐูƒุงุกโ€ฌโ€ซุงุฅู„ุตุทู†ุงุนูŠโ€ฌ
โ€ซุงู„ู…ู†ูˆููŠุฉโ€ฌ โ€ซุฌุงู…ุนุฉโ€ฌ
Ahmed Fawzy Gad
ahmed.fawzy@ci.menofia.edu.eg
Classification Example

The XOR truth table:

A B | Output
0 1 | 1
1 0 | 1
0 0 | 0
1 1 | 0

Neural Networks

The network has three layers: Input, Hidden, Output.
The XOR problem can't be solved linearly, so a single-layer perceptron can't work.
A hidden layer must be used.
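The claim above can be checked directly. The sketch below (grid values and helper names are illustrative, not from the slides) brute-forces single-neuron weights (w1, w2, bias) and confirms that none reproduces XOR, while the same search succeeds for the linearly separable AND gate:

```python
import itertools

# Truth tables: XOR (from the slides) and AND (for contrast).
xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
and_gate = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}

def linearly_separable(table):
    # Try every (w1, w2, b) on a coarse grid; one hit means a single
    # threshold neuron can realize the table.
    grid = [v / 2.0 for v in range(-6, 7)]  # -3.0 .. 3.0 in steps of 0.5
    for w1, w2, b in itertools.product(grid, repeat=3):
        if all((1 if w1 * a + w2 * x + b >= 0 else 0) == y
               for (a, x), y in table.items()):
            return True
    return False

print(linearly_separable(and_gate))  # True: one neuron suffices for AND
print(linearly_separable(xor))       # False: no single neuron does XOR
```

The failure for XOR is not an artifact of the grid: no real-valued (w1, w2, b) exists, which is why the deck adds a hidden layer.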
Input Layer

The input layer has two neurons, one for each input: A and B.
Hidden Layer

Start with two hidden-layer neurons.
Output Layer

A single output neuron produces the predicted class label Yj (1/0).
Weights

Each connection between two neurons carries a weight, Wi.
Input Layer – Hidden Layer

Four weights, W1–W4, connect the two inputs to the two hidden neurons.
Hidden Layer – Output Layer

Two weights, W5 and W6, connect the hidden neurons to the output neuron.
All Layers

In total the network has six weights: W1–W4 between the input and hidden layers, and W5–W6 between the hidden and output layers.
Activation Function

Each neuron's activation function takes a single input s and produces the neuron's output. The input s is the sum of products (SOP) of the neuron's inputs Xi and their weights Wi:

s = SOP(Xi, Wi) = Σ (i = 1 … m) Xi·Wi

where Xi are the inputs and Wi are the weights.
Each hidden/output layer neuron has its own SOP:

S1 = X1·W1 + X2·W3
๐’€๐’‹
BA
01
1
10
00
0
11
0
0.2
0.4
0.6
0.8
1
1.2
0 0.2 0.4 0.6 0.8 1 1.2
Input Hidden
๐‘พ ๐Ÿ“
๐‘พ ๐Ÿ”
A
B
๐‘พ ๐Ÿ
๐‘พ ๐Ÿ
๐‘พ ๐Ÿ‘
๐‘พ ๐Ÿ’
S= ๐Ÿ
๐’Ž
๐‘ฟ๐’Š ๐‘พ๐’Š
1/0
Activation Function
Inputs
Output
s
๐‘ฟ ๐Ÿ
๐‘ฟ ๐Ÿ
๐‘บ ๐Ÿ=(๐‘ฟ ๐Ÿ ๐‘พ ๐Ÿ+๐‘ฟ ๐Ÿ ๐‘พ ๐Ÿ’)
๐’€๐’‹
BA
01
1
10
00
0
11
0
0.2
0.4
0.6
0.8
1
1.2
0 0.2 0.4 0.6 0.8 1 1.2
Input Hidden
๐‘พ ๐Ÿ“
๐‘พ ๐Ÿ”
A
B
๐‘พ ๐Ÿ
๐‘พ ๐Ÿ
๐‘พ ๐Ÿ‘
๐‘พ ๐Ÿ’
S= ๐Ÿ
๐’Ž
๐‘ฟ๐’Š ๐‘พ๐’Š
1/0
Activation Function
Inputs
Output
s
๐‘ฟ ๐Ÿ
๐‘ฟ ๐Ÿ
๐‘บ ๐Ÿ‘=(๐‘บ ๐Ÿ ๐‘พ ๐Ÿ“+๐‘บ ๐Ÿ ๐‘พ ๐Ÿ”)
๐’€๐’‹
BA
01
1
10
00
0
11
0
0.2
0.4
0.6
0.8
1
1.2
0 0.2 0.4 0.6 0.8 1 1.2
Input Hidden
๐‘พ ๐Ÿ“
๐‘พ ๐Ÿ”
A
B
๐‘พ ๐Ÿ
๐‘พ ๐Ÿ
๐‘พ ๐Ÿ‘
๐‘พ ๐Ÿ’
S= ๐Ÿ
๐’Ž
๐‘ฟ๐’Š ๐‘พ๐’Š
1/0
Activation Function
Outputs
Output
F(s)s
๐‘ฟ ๐Ÿ
๐‘ฟ ๐Ÿ
Class Label
๐’€๐’‹
BA
01
1
10
00
0
11
0
0.2
0.4
0.6
0.8
1
1.2
0 0.2 0.4 0.6 0.8 1 1.2
Input Hidden
1/0
๐‘พ ๐Ÿ“
๐‘พ ๐Ÿ”
A
B
๐‘พ ๐Ÿ
๐‘พ ๐Ÿ
๐‘พ ๐Ÿ‘
๐‘พ ๐Ÿ’
Activation Function
Outputs
Output
F(s)s
๐‘ฟ ๐Ÿ
๐‘ฟ ๐Ÿ
Class Label
๐’€๐’‹
BA
01
1
10
00
0
11
0
0.2
0.4
0.6
0.8
1
1.2
0 0.2 0.4 0.6 0.8 1 1.2
Input Hidden
1/0
Each Hidden/Output
Layer Neuron has its
Activation Function.
๐‘พ ๐Ÿ“
๐‘พ ๐Ÿ”
A
B
๐‘พ ๐Ÿ
๐‘พ ๐Ÿ
๐‘พ ๐Ÿ‘
๐‘พ ๐Ÿ’
Activation Functions

Candidates: Piecewise Linear, Sigmoid, Binary.
Which activation function to use?

The activation function's outputs become the class labels. There are TWO class labels, so we need a function that gives TWO outputs.
Of the candidates (Piecewise Linear, Sigmoid, Binary), the Binary activation function is selected: it gives exactly two outputs.
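The binary activation used in the rest of the deck can be written as a one-line function; the threshold-at-zero form, bin(s) = +1 for s ≥ 0 and 0 otherwise, is taken from the worked example later in the slides:

```python
def bin_activation(s):
    """Binary (threshold) activation: 1 if s >= 0, else 0."""
    return 1 if s >= 0 else 0

print(bin_activation(-0.5), bin_activation(0.5))  # 0 1
```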
Bias

Hidden Layer Neurons: each hidden neuron receives a bias input, a constant +1 weighted by b1 (first hidden neuron) and b2 (second hidden neuron).
Output Layer Neurons: the output neuron receives a bias input +1 weighted by b3. The network therefore has three bias values in total: b1, b2, and b3.
Add Bias to SOP

The bias term is added to each neuron's sum of products. Without bias:

S1 = X1·W1 + X2·W3
S2 = X1·W2 + X2·W4
S3 = S1·W5 + S2·W6
With bias, the first hidden neuron's SOP becomes:

S1 = +1·b1 + X1·W1 + X2·W3
The second hidden neuron's SOP becomes:

S2 = +1·b2 + X1·W2 + X2·W4
The output neuron's SOP becomes:

S3 = +1·b3 + S1·W5 + S2·W6
Bias Importance

Consider the line equation y = ax + b, where b is the y-intercept; the neuron's bias plays the same role. With b = 0, the line is forced to pass through the origin.
With a positive intercept (b = +v), the line shifts upward, away from the origin.
With a negative intercept (b = −v), the line shifts downward. The intercept thus lets the line sit anywhere in the plane rather than only through the origin.
The same concept applies to the neuron's bias:

S = Σ (i = 1 … m) Xi·Wi + bias
Learning Rate

The learning rate η satisfies 0 ≤ η ≤ 1.
Summary of Parameters

- Inputs: X(n) = (X0, X1, X2, …)
- Weights: W(n) = (W0, W1, W2, …)
- Bias: carried as a weight on a constant +1 input
- Sum of Products (SOP): s = X0·W0 + X1·W1 + X2·W2 + …
- Activation Function: bin
- Outputs: Yj
- Learning Rate: η, with 0 ≤ η ≤ 1
Other Parameters

- Step: n = 0, 1, 2, …
- Desired Output d(n):
  d(n) = 1 if x(n) belongs to class C1 (label 1)
  d(n) = 0 if x(n) belongs to class C2 (label 0)
Neural Networks Training Steps

1. Weights Initialization
2. Inputs Application
3. Sum of Inputs-Weights Products
4. Activation Function Response Calculation
5. Weights Adaptation
6. Back to Step 2
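The six steps above can be sketched as a training loop. This is a hedged single-neuron sketch (the multi-layer XOR network of these slides computes a SOP per neuron, omitted here for brevity); function names are illustrative:

```python
def bin_act(s):
    # Step 4: binary activation response.
    return 1 if s >= 0 else 0

def adapt(w, eta, d, y, x):
    # Step 5: W(n+1) = W(n) + eta * (d - y) * X(n), element-wise.
    return [wi + eta * (d - y) * xi for wi, xi in zip(w, x)]

def train(samples, weights, eta, epochs=10):
    """Skeleton of the six training steps for one threshold neuron.

    samples: list of (x_vector, desired); x_vector[0] is the +1 bias input.
    weights: initial weight vector (step 1: weights initialization).
    """
    for _ in range(epochs):                                 # step 6: back to step 2
        for x, d in samples:                                # step 2: apply inputs
            s = sum(xi * wi for xi, wi in zip(x, weights))  # step 3: SOP
            y = bin_act(s)                                  # step 4: activation
            if y != d:                                      # step 5: adapt if wrong
                weights = adapt(weights, eta, d, y, x)
    return weights

# Illustration on the AND gate (linearly separable, so one neuron suffices):
and_samples = [((1, 0, 0), 0), ((1, 0, 1), 0), ((1, 1, 0), 0), ((1, 1, 1), 1)]
w = train(and_samples, [0.0, 0.0, 0.0], eta=0.5)
print(w)
```

AND is used here instead of XOR precisely because, as the slides show, XOR cannot be learned by a single neuron.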
Regarding the 5th Step: Weights Adaptation

• If the predicted output Y is not the same as the desired output d, then the weights are adapted according to the following equation:

W(n+1) = W(n) + η·[d(n) − Y(n)]·X(n)

where W(n) = [b(n), W1(n), W2(n), W3(n), …, Wm(n)]
Neural Networks Training Example

Step n=0

• In each step of the solution, the parameters of the neural network must be known.
• Parameters of step n=0:

η = 0.001
X(n) = X(0) = (+1, +1, +1, 1, 0)
W(n) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6)
            = (-1.5, -0.5, -0.5, 1, 1, 1, 1, -2, 1)
d(n) = d(0) = 1
Step n=0 – SOP – S1

S1 = +1·b1 + X1·W1 + X2·W3
   = 1·(-1.5) + 1·1 + 0·1
   = -0.5
Step n=0 – Output – S1

bin(s) = +1, s ≥ 0; 0, s < 0

Y(S1) = BIN(S1) = BIN(-0.5) = 0
Step n=0 – SOP – S2

S2 = +1·b2 + X1·W2 + X2·W4
   = 1·(-0.5) + 1·1 + 0·1
   = 0.5
Step n=0 – Output – S2

Y(S2) = BIN(S2) = BIN(0.5) = 1
Step n=0 – SOP – S3

The output neuron's SOP uses the activated hidden outputs, Y(S1) = 0 and Y(S2) = 1:

S3 = +1·b3 + Y(S1)·W5 + Y(S2)·W6
   = 1·(-0.5) + 0·(-2) + 1·1
   = 0.5
Step n=0 – Output – S3

Y(S3) = BIN(S3) = BIN(0.5) = 1
Step n=0 – Output

Y(n) = Y(0) = Y(S3) = 1
Step n=0 – Predicted vs. Desired

Y(n) = Y(0) = 1
d(n) = d(0) = 1

Since Y(n) = d(n), the weights are correct: no adaptation.
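Step n=0 can be replayed in code; the weights, input (A, B) = (1, 0), and the binary activation are exactly those of the example above:

```python
def bin_act(s):
    """Binary activation from the example: 1 if s >= 0, else 0."""
    return 1 if s >= 0 else 0

# Weights of the example network (never adapted during training):
b1, b2, b3 = -1.5, -0.5, -0.5
w1, w2, w3, w4, w5, w6 = 1, 1, 1, 1, -2, 1

x1, x2, d = 1, 0, 1                                   # step n=0 sample
s1 = 1 * b1 + x1 * w1 + x2 * w3                       # -0.5
s2 = 1 * b2 + x1 * w2 + x2 * w4                       #  0.5
s3 = 1 * b3 + bin_act(s1) * w5 + bin_act(s2) * w6     #  0.5
y = bin_act(s3)
print(s1, s2, s3, y, y == d)                          # -0.5 0.5 0.5 1 True
```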
Step n=1

• Parameters of step n=1:

η = 0.001
X(n) = X(1) = (+1, +1, +1, 0, 1)
W(n) = W(1) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6)
            = (-1.5, -0.5, -0.5, 1, 1, 1, 1, -2, 1)
d(n) = d(1) = 1
Step n=1 – SOP – S1

S1 = +1·b1 + X1·W1 + X2·W3
   = 1·(-1.5) + 0·1 + 1·1
   = -0.5
Step n=1 – Output – S1

Y(S1) = BIN(S1) = BIN(-0.5) = 0
Step n=1 – SOP – S2

S2 = +1·b2 + X1·W2 + X2·W4
   = 1·(-0.5) + 0·1 + 1·1
   = 0.5
Step n=1 – Output – S2

Y(S2) = BIN(S2) = BIN(0.5) = 1
Step n=1 – SOP – S3

S3 = +1·b3 + Y(S1)·W5 + Y(S2)·W6
   = 1·(-0.5) + 0·(-2) + 1·1
   = 0.5
Step n=1 – Output – S3

Y(S3) = BIN(S3) = BIN(0.5) = 1
Step n=1 – Output

Y(n) = Y(1) = Y(S3) = 1
Step n=1 – Predicted vs. Desired

Y(n) = Y(1) = 1
d(n) = d(1) = 1

Since Y(n) = d(n), the weights are correct: no adaptation.
Step n=2

• Parameters of step n=2:

η = 0.001
X(n) = X(2) = (+1, +1, +1, 0, 0)
W(n) = W(2) = W(1) = (b1, b2, b3, w1, w2, w3, w4, w5, w6)
            = (-1.5, -0.5, -0.5, 1, 1, 1, 1, -2, 1)
d(n) = d(2) = 0
Step n=2 – SOP – S1

S1 = +1·b1 + X1·W1 + X2·W3
   = 1·(-1.5) + 0·1 + 0·1
   = -1.5
Step n=2 – Output – S1

Y(S1) = BIN(S1) = BIN(-1.5) = 0
Step n=2 – SOP – S2

S2 = +1·b2 + X1·W2 + X2·W4
   = 1·(-0.5) + 0·1 + 0·1
   = -0.5
Step n=2 – Output – S2

Y(S2) = BIN(S2) = BIN(-0.5) = 0
Step n=2 – SOP – S3

S3 = +1·b3 + Y(S1)·W5 + Y(S2)·W6
   = 1·(-0.5) + 0·(-2) + 0·1
   = -0.5
Step n=2 – Output – S3

Y(S3) = BIN(S3) = BIN(-0.5) = 0
Step n=2 – Output

Y(n) = Y(2) = Y(S3) = 0
Step n=2 – Predicted vs. Desired

Y(n) = Y(2) = 0
d(n) = d(2) = 0

Since Y(n) = d(n), the weights are correct: no adaptation.
Step n=3

• Parameters of step n=3:

η = 0.001
X(n) = X(3) = (+1, +1, +1, 1, 1)
W(n) = W(3) = W(2) = (b1, b2, b3, w1, w2, w3, w4, w5, w6)
            = (-1.5, -0.5, -0.5, 1, 1, 1, 1, -2, 1)
d(n) = d(3) = 0
Step n=3 – SOP – S1

S1 = +1·b1 + X1·W1 + X2·W3
   = 1·(-1.5) + 1·1 + 1·1
   = 0.5
Step n=3 – Output – S1

Y(S1) = BIN(S1) = BIN(0.5) = 1
Step n=3 – SOP – S2

S2 = +1·b2 + X1·W2 + X2·W4
   = 1·(-0.5) + 1·1 + 1·1
   = 1.5
Step n=3 – Output – S2

Y(S2) = BIN(S2) = BIN(1.5) = 1
Step n=3 – SOP – S3

S3 = +1·b3 + Y(S1)·W5 + Y(S2)·W6
   = 1·(-0.5) + 1·(-2) + 1·1
   = -1.5
Step n=3 – Output – S3

Y(S3) = BIN(S3) = BIN(-1.5) = 0
Step n=3 – Output

Y(n) = Y(3) = Y(S3) = 0
Step n=3 – Predicted vs. Desired

Y(n) = Y(3) = 0
d(n) = d(3) = 0

Since Y(n) = d(n), the weights are correct: no adaptation.
Final Weights

The current weights predicted the desired outputs for all four inputs, so no adaptation was ever needed:

(b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -0.5, -0.5, 1, 1, 1, 1, -2, 1)
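This conclusion can be verified end to end: running the final weights over all four XOR inputs reproduces every desired output.

```python
def bin_act(s):
    return 1 if s >= 0 else 0

def predict(x1, x2):
    # Final bias values and weights from the training example.
    b1, b2, b3 = -1.5, -0.5, -0.5
    w1, w2, w3, w4, w5, w6 = 1, 1, 1, 1, -2, 1
    h1 = bin_act(b1 + x1 * w1 + x2 * w3)   # first hidden neuron
    h2 = bin_act(b2 + x1 * w2 + x2 * w4)   # second hidden neuron
    return bin_act(b3 + h1 * w5 + h2 * w6) # output neuron

xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
print(all(predict(a, b) == y for (a, b), y in xor.items()))  # True
```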

Similar to Artificial Neural Networks (ANNs) - XOR - Step-By-Step

Introduction to Artificial Neural Networks (ANNs) - Step-by-Step Training & T...
Introduction to Artificial Neural Networks (ANNs) - Step-by-Step Training & T...Introduction to Artificial Neural Networks (ANNs) - Step-by-Step Training & T...
Introduction to Artificial Neural Networks (ANNs) - Step-by-Step Training & T...Ahmed Gad
ย 
Digital Electronics Fundamentals
Digital Electronics Fundamentals Digital Electronics Fundamentals
Digital Electronics Fundamentals Darwin Nesakumar
ย 
Neural Network Back Propagation Algorithm
Neural Network Back Propagation AlgorithmNeural Network Back Propagation Algorithm
Neural Network Back Propagation AlgorithmMartin Opdam
ย 
Fourier analysis presentation for thunder chasers
Fourier analysis presentation for thunder chasersFourier analysis presentation for thunder chasers
Fourier analysis presentation for thunder chasersMohammad Iftekher Ebne Jalal
ย 
Combinational logic 2
Combinational logic 2Combinational logic 2
Combinational logic 2Heman Pathak
ย 
Digital-Logic40124sequential circuits logic gatepptx
Digital-Logic40124sequential circuits logic gatepptxDigital-Logic40124sequential circuits logic gatepptx
Digital-Logic40124sequential circuits logic gatepptxssuser6feece1
ย 
CH11-Digital Logic.pptx
CH11-Digital Logic.pptxCH11-Digital Logic.pptx
CH11-Digital Logic.pptxFathoniMahardika1
ย 
Square of an Input Number - Digital Logic Design | Lecture 5
Square of an Input Number - Digital Logic Design | Lecture 5Square of an Input Number - Digital Logic Design | Lecture 5
Square of an Input Number - Digital Logic Design | Lecture 5JalpaMaheshwari1
ย 
Logic circuit2017
Logic circuit2017Logic circuit2017
Logic circuit2017Lhorelie Arcega
ย 
Feedback amplifier
Feedback amplifierFeedback amplifier
Feedback amplifiersaju Sajube82
ย 
Logic gates and logic circuits
Logic gates and logic circuitsLogic gates and logic circuits
Logic gates and logic circuitsjyoti_lakhani
ย 
Deep learning simplified
Deep learning simplifiedDeep learning simplified
Deep learning simplifiedLovelyn Rose
ย 
OCR GCSE Computing - Binary logic and Truth Tables
OCR GCSE Computing - Binary logic and Truth TablesOCR GCSE Computing - Binary logic and Truth Tables
OCR GCSE Computing - Binary logic and Truth Tablesnorthernkiwi
ย 
Logic gates presentation
Logic gates presentationLogic gates presentation
Logic gates presentationpriyanka bisarya
ย 
Dcs lec03 - z-analysis of discrete time control systems
Dcs   lec03 - z-analysis of discrete time control systemsDcs   lec03 - z-analysis of discrete time control systems
Dcs lec03 - z-analysis of discrete time control systemsAmr E. Mohamed
ย 
Mba admission in india
Mba admission in indiaMba admission in india
Mba admission in indiaEdhole.com
ย 
Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...
Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...
Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...Hsien-Hsin Sean Lee, Ph.D.
ย 
Computer archi&mp
Computer archi&mpComputer archi&mp
Computer archi&mpMSc CST
ย 

Similar to Artificial Neural Networks (ANNs) - XOR - Step-By-Step (20)

Introduction to Artificial Neural Networks (ANNs) - Step-by-Step Training & T...
Introduction to Artificial Neural Networks (ANNs) - Step-by-Step Training & T...Introduction to Artificial Neural Networks (ANNs) - Step-by-Step Training & T...
Introduction to Artificial Neural Networks (ANNs) - Step-by-Step Training & T...
ย 
Digital Electronics Fundamentals
Digital Electronics Fundamentals Digital Electronics Fundamentals
Digital Electronics Fundamentals
ย 
Neural Network Back Propagation Algorithm
Neural Network Back Propagation AlgorithmNeural Network Back Propagation Algorithm
Neural Network Back Propagation Algorithm
ย 
Fourier analysis presentation for thunder chasers
Fourier analysis presentation for thunder chasersFourier analysis presentation for thunder chasers
Fourier analysis presentation for thunder chasers
ย 
Combinational logic 2
Combinational logic 2Combinational logic 2
Combinational logic 2
ย 
Digital-Logic40124sequential circuits logic gatepptx
Digital-Logic40124sequential circuits logic gatepptxDigital-Logic40124sequential circuits logic gatepptx
Digital-Logic40124sequential circuits logic gatepptx
ย 
CH11-Digital Logic.pptx
CH11-Digital Logic.pptxCH11-Digital Logic.pptx
CH11-Digital Logic.pptx
ย 
Square of an Input Number - Digital Logic Design | Lecture 5
Square of an Input Number - Digital Logic Design | Lecture 5Square of an Input Number - Digital Logic Design | Lecture 5
Square of an Input Number - Digital Logic Design | Lecture 5
ย 
Mod 3.pptx
Mod 3.pptxMod 3.pptx
Mod 3.pptx
ย 
Logic circuit2017
Logic circuit2017Logic circuit2017
Logic circuit2017
ย 
Feedback amplifier
Feedback amplifierFeedback amplifier
Feedback amplifier
ย 
Logic gates and logic circuits
Logic gates and logic circuitsLogic gates and logic circuits
Logic gates and logic circuits
ย 
Deep learning simplified
Deep learning simplifiedDeep learning simplified
Deep learning simplified
ย 
Logic Equation Simplification
Logic Equation SimplificationLogic Equation Simplification
Logic Equation Simplification
ย 
OCR GCSE Computing - Binary logic and Truth Tables
OCR GCSE Computing - Binary logic and Truth TablesOCR GCSE Computing - Binary logic and Truth Tables
OCR GCSE Computing - Binary logic and Truth Tables
ย 
Logic gates presentation
Logic gates presentationLogic gates presentation
Logic gates presentation
ย 
Dcs lec03 - z-analysis of discrete time control systems
Dcs   lec03 - z-analysis of discrete time control systemsDcs   lec03 - z-analysis of discrete time control systems
Dcs lec03 - z-analysis of discrete time control systems
ย 
Mba admission in india
Mba admission in indiaMba admission in india
Mba admission in india
ย 
Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...
Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...
Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...
ย 
Computer archi&mp
Computer archi&mpComputer archi&mp
Computer archi&mp
ย 

More from Ahmed Gad

ICEIT'20 Cython for Speeding-up Genetic Algorithm
ICEIT'20 Cython for Speeding-up Genetic AlgorithmICEIT'20 Cython for Speeding-up Genetic Algorithm
ICEIT'20 Cython for Speeding-up Genetic AlgorithmAhmed Gad
ย 
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...Ahmed Gad
ย 
Python for Computer Vision - Revision 2nd Edition
Python for Computer Vision - Revision 2nd EditionPython for Computer Vision - Revision 2nd Edition
Python for Computer Vision - Revision 2nd EditionAhmed Gad
ย 
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...Ahmed Gad
ย 
M.Sc. Thesis - Automatic People Counting in Crowded Scenes
M.Sc. Thesis - Automatic People Counting in Crowded ScenesM.Sc. Thesis - Automatic People Counting in Crowded Scenes
M.Sc. Thesis - Automatic People Counting in Crowded ScenesAhmed Gad
ย 
Derivation of Convolutional Neural Network from Fully Connected Network Step-...
Derivation of Convolutional Neural Network from Fully Connected Network Step-...Derivation of Convolutional Neural Network from Fully Connected Network Step-...
Derivation of Convolutional Neural Network from Fully Connected Network Step-...Ahmed Gad
ย 
Introduction to Optimization with Genetic Algorithm (GA)
Introduction to Optimization with Genetic Algorithm (GA)Introduction to Optimization with Genetic Algorithm (GA)
Introduction to Optimization with Genetic Algorithm (GA)Ahmed Gad
ย 
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...Ahmed Gad
ย 
Avoid Overfitting with Regularization
Avoid Overfitting with RegularizationAvoid Overfitting with Regularization
Avoid Overfitting with RegularizationAhmed Gad
ย 
Genetic Algorithm (GA) Optimization - Step-by-Step Example
Genetic Algorithm (GA) Optimization - Step-by-Step ExampleGenetic Algorithm (GA) Optimization - Step-by-Step Example
Genetic Algorithm (GA) Optimization - Step-by-Step ExampleAhmed Gad
ย 
ICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression AnalysisICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression AnalysisAhmed Gad
ย 
Computer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and GradientComputer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and GradientAhmed Gad
ย 
Python for Computer Vision - Revision
Python for Computer Vision - RevisionPython for Computer Vision - Revision
Python for Computer Vision - RevisionAhmed Gad
ย 
Anime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia CourseAnime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia CourseAhmed Gad
ย 
Operations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by ExampleOperations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by ExampleAhmed Gad
ย 
MATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and TrackingMATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and TrackingAhmed Gad
ย 
MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...Ahmed Gad
ย 
Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Ahmed Gad
ย 
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...Ahmed Gad
ย 
Introduction to Digital Signal Processing (DSP) - Course Notes
Introduction to Digital Signal Processing (DSP) - Course NotesIntroduction to Digital Signal Processing (DSP) - Course Notes
Introduction to Digital Signal Processing (DSP) - Course NotesAhmed Gad
ย 

More from Ahmed Gad (20)

ICEIT'20 Cython for Speeding-up Genetic Algorithm
ICEIT'20 Cython for Speeding-up Genetic AlgorithmICEIT'20 Cython for Speeding-up Genetic Algorithm
ICEIT'20 Cython for Speeding-up Genetic Algorithm
ย 
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
ย 
Python for Computer Vision - Revision 2nd Edition
Python for Computer Vision - Revision 2nd EditionPython for Computer Vision - Revision 2nd Edition
Python for Computer Vision - Revision 2nd Edition
ย 
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
ย 
M.Sc. Thesis - Automatic People Counting in Crowded Scenes
M.Sc. Thesis - Automatic People Counting in Crowded ScenesM.Sc. Thesis - Automatic People Counting in Crowded Scenes
M.Sc. Thesis - Automatic People Counting in Crowded Scenes
ย 
Derivation of Convolutional Neural Network from Fully Connected Network Step-...
Derivation of Convolutional Neural Network from Fully Connected Network Step-...Derivation of Convolutional Neural Network from Fully Connected Network Step-...
Derivation of Convolutional Neural Network from Fully Connected Network Step-...
ย 
Introduction to Optimization with Genetic Algorithm (GA)
Introduction to Optimization with Genetic Algorithm (GA)Introduction to Optimization with Genetic Algorithm (GA)
Introduction to Optimization with Genetic Algorithm (GA)
ย 
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
ย 
Avoid Overfitting with Regularization
Avoid Overfitting with RegularizationAvoid Overfitting with Regularization
Avoid Overfitting with Regularization
ย 
Genetic Algorithm (GA) Optimization - Step-by-Step Example
Genetic Algorithm (GA) Optimization - Step-by-Step ExampleGenetic Algorithm (GA) Optimization - Step-by-Step Example
Genetic Algorithm (GA) Optimization - Step-by-Step Example
ย 
ICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression AnalysisICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ย 
Computer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and GradientComputer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and Gradient
ย 
Python for Computer Vision - Revision
Python for Computer Vision - RevisionPython for Computer Vision - Revision
Python for Computer Vision - Revision
ย 
Anime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia CourseAnime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia Course
ย 
Operations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by ExampleOperations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by Example
ย 
MATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and TrackingMATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and Tracking
ย 
MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...
ย 
Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...
ย 
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
ย 
Introduction to Digital Signal Processing (DSP) - Course Notes
Introduction to Digital Signal Processing (DSP) - Course NotesIntroduction to Digital Signal Processing (DSP) - Course Notes
Introduction to Digital Signal Processing (DSP) - Course Notes
ย 

Recently uploaded

MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxAnupkumar Sharma
ย 
4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptxmary850239
ย 
Visit to a blind student's school๐Ÿง‘โ€๐Ÿฆฏ๐Ÿง‘โ€๐Ÿฆฏ(community medicine)
Visit to a blind student's school๐Ÿง‘โ€๐Ÿฆฏ๐Ÿง‘โ€๐Ÿฆฏ(community medicine)Visit to a blind student's school๐Ÿง‘โ€๐Ÿฆฏ๐Ÿง‘โ€๐Ÿฆฏ(community medicine)
Visit to a blind student's school๐Ÿง‘โ€๐Ÿฆฏ๐Ÿง‘โ€๐Ÿฆฏ(community medicine)lakshayb543
ย 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPCeline George
ย 
Global Lehigh Strategic Initiatives (without descriptions)
Global Lehigh Strategic Initiatives (without descriptions)Global Lehigh Strategic Initiatives (without descriptions)
Global Lehigh Strategic Initiatives (without descriptions)cama23
ย 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxCarlos105
ย 
ICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfVanessa Camilleri
ย 
Transaction Management in Database Management System
Transaction Management in Database Management SystemTransaction Management in Database Management System
Transaction Management in Database Management SystemChristalin Nelson
ย 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management systemChristalin Nelson
ย 
Integumentary System SMP B. Pharm Sem I.ppt
Integumentary System SMP B. Pharm Sem I.pptIntegumentary System SMP B. Pharm Sem I.ppt
Integumentary System SMP B. Pharm Sem I.pptshraddhaparab530
ย 
Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4JOYLYNSAMANIEGO
ย 
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSGRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSJoshuaGantuangco2
ย 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxiammrhaywood
ย 
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxINTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxHumphrey A Beรฑa
ย 
Karra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxKarra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxAshokKarra1
ย 
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Celine George
ย 
Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Celine George
ย 

Recently uploaded (20)

MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
ย 
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptxFINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
ย 
4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx
ย 
Visit to a blind student's school๐Ÿง‘โ€๐Ÿฆฏ๐Ÿง‘โ€๐Ÿฆฏ(community medicine)
Visit to a blind student's school๐Ÿง‘โ€๐Ÿฆฏ๐Ÿง‘โ€๐Ÿฆฏ(community medicine)Visit to a blind student's school๐Ÿง‘โ€๐Ÿฆฏ๐Ÿง‘โ€๐Ÿฆฏ(community medicine)
Visit to a blind student's school๐Ÿง‘โ€๐Ÿฆฏ๐Ÿง‘โ€๐Ÿฆฏ(community medicine)
ย 
YOUVE GOT EMAIL_FINALS_EL_DORADO_2024.pptx
YOUVE GOT EMAIL_FINALS_EL_DORADO_2024.pptxYOUVE GOT EMAIL_FINALS_EL_DORADO_2024.pptx
YOUVE GOT EMAIL_FINALS_EL_DORADO_2024.pptx
ย 
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptxLEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
ย 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERP
ย 
Global Lehigh Strategic Initiatives (without descriptions)
Global Lehigh Strategic Initiatives (without descriptions)Global Lehigh Strategic Initiatives (without descriptions)
Global Lehigh Strategic Initiatives (without descriptions)
ย 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
ย 
ICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdf
ย 
Transaction Management in Database Management System
Transaction Management in Database Management SystemTransaction Management in Database Management System
Transaction Management in Database Management System
ย 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management system
ย 
Integumentary System SMP B. Pharm Sem I.ppt
Integumentary System SMP B. Pharm Sem I.pptIntegumentary System SMP B. Pharm Sem I.ppt
Integumentary System SMP B. Pharm Sem I.ppt
ย 
Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4
ย 
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSGRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
ย 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ย 
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxINTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
ย 
Karra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxKarra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptx
ย 
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
ย 
Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17
ย 

Artificial Neural Networks (ANNs) - XOR - Step-By-Step

  • 1. Artificial Neural Networks (ANNs): XOR Step-By-Step. MENOUFIA UNIVERSITY, FACULTY OF COMPUTERS AND INFORMATION, ALL DEPARTMENTS, ARTIFICIAL INTELLIGENCE. Ahmed Fawzy Gad, ahmed.fawzy@ci.menofia.edu.eg
  • 3. Neural Networks: Input, Hidden, and Output layers. XOR truth table (A, B → output): 0,0 → 0; 0,1 → 1; 1,0 → 1; 1,1 → 0.
  • 5. Neural Networks: plotting the four XOR points shows the two classes can't be separated linearly, so a single-layer perceptron can't work. Use a hidden layer.
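The claim that XOR is not linearly separable can be checked by brute force: a minimal sketch (the weight grid and the step-at-zero convention are assumptions for illustration) searches for a single linear threshold unit that reproduces XOR and finds none.

```python
# Brute-force sanity check that XOR is not linearly separable: search a
# coarse grid of weights w1, w2 and thresholds t for a single threshold
# unit step(w1*A + w2*B - t) that reproduces XOR. (The grid range and the
# "1 if s >= 0" step convention are illustrative assumptions.)
xor_table = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def step(s):
    return 1 if s >= 0 else 0

grid = [x / 2 for x in range(-8, 9)]  # -4.0, -3.5, ..., 4.0
solutions = [
    (w1, w2, t)
    for w1 in grid for w2 in grid for t in grid
    if all(step(w1 * a + w2 * b - t) == y for (a, b), y in xor_table.items())
]
print(len(solutions))  # 0: no single-layer separator exists
```

By contrast, AND or OR would yield many solutions on the same grid, which is exactly why XOR needs a hidden layer.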
  • 8. Hidden Layer: start with two neurons between the inputs (A, B) and the output.
  • 9. Output Layer: a single neuron producing the output Yj (1/0).
  • 10. Weights: each connection carries a weight Wi.
  • 11. Weights, Input Layer – Hidden Layer: W1, W2, W3, W4.
  • 12. Weights, Hidden Layer – Output Layer: W5, W6.
  • 13. All Layers: A and B feed the hidden neurons through W1–W4; the hidden neurons feed the output Yj through W5 and W6.
  • 14–16. Activation Function: applied at the output neuron to produce Yj (1/0).
  • 17. Activation Function Components.
  • 18–19. Activation Function Inputs: each activation function takes the sum of products s = SOP(Xi, Wi).
  • 20. Xi = Inputs, Wi = Weights.
  • 21. The network inputs are X1 and X2.
  • 22. S = Σ (i = 1 to m) Xi·Wi.
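The SOP formula above can be sketched as a tiny helper; the function name `sop` and the sample numbers are illustrative assumptions, not from the slides:

```python
# s = SOP(Xi, Wi): the sum over i = 1..m of Xi * Wi.
def sop(inputs, weights):
    return sum(x * w for x, w in zip(inputs, weights))

print(sop([1, 1], [2, 3]))      # 5
print(sop([0, 1], [0.4, 0.7]))  # 0.7
```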
  • 23. Each hidden/output layer neuron has its own SOP.
  • 24. S1 = X1W1 + X2W3
  • 25. S2 = X1W2 + X2W4
  • 26. S3 = S1W5 + S2W6
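The three per-neuron SOPs can be traced numerically; the input and weight values below are made-up illustrative numbers (the slides don't assign any):

```python
# Per-neuron SOPs for the 2-2-1 network, before bias and activation.
X1, X2 = 1, 0                          # sample input pattern
W1, W2, W3, W4 = 0.5, -0.3, 0.2, 0.9   # input -> hidden weights
W5, W6 = 0.7, -0.4                     # hidden -> output weights

S1 = X1 * W1 + X2 * W3   # SOP of hidden neuron 1
S2 = X1 * W2 + X2 * W4   # SOP of hidden neuron 2
S3 = S1 * W5 + S2 * W6   # SOP of the output neuron
print(S1, S2, S3)
```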
  • 27. Activation Function: the activation function F(s) maps the SOP s to the output Yj, the predicted class label.
  • 28. Activation Function: each hidden-layer and output-layer neuron has its own activation function.
  • 30. Activation Functions: which activation function to use? The outputs are class labels; since there are TWO class labels (0 and 1), we need an activation function that gives two outputs.
  • 32. Activation Function: the binary (bin) activation function is used, so the output Yj is 1 or 0.
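The binary activation named on this slide can be sketched as a one-line threshold function; the name `bin_step` is illustrative, and the threshold at s = 0 follows the definition used in the worked example later in the deck.

```python
def bin_step(s):
    """Binary step activation: maps the SOP s to a class label, 1 if s >= 0 else 0."""
    return 1 if s >= 0 else 0
```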
  • 33. Bias: in addition to the weighted inputs, each neuron receives a bias.
  • 34. Bias, Hidden Layer Neurons: each hidden neuron receives a +1 input weighted by its bias (b1 for the first hidden neuron, b2 for the second).
  • 35. Bias, Output Layer Neurons: the output neuron receives a +1 input weighted by its bias b3.
  • 36. All Bias Values: b1, b2, and b3.
  • 37. Bias, Add Bias to SOP: the previous SOPs were S1 = X1*W1 + X2*W3, S2 = X1*W2 + X2*W4, and S3 = S1*W5 + S2*W6.
  • 38. Bias, Add Bias to SOP: S1 = +1*b1 + X1*W1 + X2*W3.
  • 39. Bias, Add Bias to SOP: S2 = +1*b2 + X1*W2 + X2*W4.
  • 40. Bias, Add Bias to SOP: S3 = +1*b3 + S1*W5 + S2*W6.
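The bias-augmented SOPs above can be sketched as a small helper. The function name is illustrative (not from the slides), and the numeric values below are the ones the deck's worked example uses later.

```python
def sop(inputs, weights, bias):
    """Sum of products plus bias: s = bias + sum(x_i * w_i)."""
    return bias + sum(x * w for x, w in zip(inputs, weights))

# S1 for input (X1, X2) = (1, 0) with W1 = W3 = 1 and b1 = -1.5:
s1 = sop([1, 0], [1, 1], -1.5)   # -0.5
```

The bias is just another term in the sum, so it can equivalently be treated as a weight on a constant +1 input.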
  • 41. Bias Importance: why does the network need a bias at all?
  • 42. Bias Importance: consider a line in the X-Y plane.
  • 43. Bias Importance: the line has the equation y = ax + b.
  • 44. Bias Importance: a is the slope and b is the intercept.
  • 45. Bias Importance: b is the Y-intercept, where the line crosses the Y axis.
  • 46. Bias Importance: with b = 0, the line y = ax must pass through the origin.
  • 47. Bias Importance: when b = 0, changing the slope a only rotates the line about the origin; it can never move away from it.
  • 48. Bias Importance: with b = +v, the line is shifted upward.
  • 49. Bias Importance: a positive intercept places the line above the origin.
  • 50. Bias Importance: with b = -v, the line is shifted downward.
  • 51. Bias Importance: by varying b, the line can be placed anywhere in the plane, not just through the origin.
  • 52-56. Bias Importance: the same concept applies to the neuron's bias. The SOP becomes S = sum_{i=1..m} Xi*Wi + BIAS, so the bias shifts the neuron's decision boundary just as b shifts the line.
  • 57. Learning Rate: 0 ≤ η ≤ 1.
  • 58. Summary of Parameters, Inputs: X(n) = (X0, X1, X2, ...).
  • 59. Summary of Parameters, Weights: W(n) = (W0, W1, W2, ...).
  • 60. Summary of Parameters, Bias: the bias values enter the SOP as weights on constant +1 inputs.
  • 61. Summary of Parameters, Sum of Products (SOP): s = X0*W0 + X1*W1 + X2*W2 + ...
  • 62. Summary of Parameters, Activation Function: bin, computing bin(s).
  • 63. Summary of Parameters, Outputs: Yj = F(s).
  • 64. Summary of Parameters, Learning Rate: η, with 0 ≤ η ≤ 1.
  • 65. Other Parameters, Step n: n = 0, 1, 2, ...
  • 66. Other Parameters, Desired Output dj: d(n) = 1 if x(n) belongs to class C1, and 0 if x(n) belongs to class C2.
  • 67. Neural Networks Training Steps: (1) weights initialization, (2) inputs application, (3) sum of inputs-weights products, (4) activation function response calculation, (5) weights adaptation, (6) back to step 2.
  • 68. Regarding the 5th Step, Weights Adaptation: if the predicted output Y is not the same as the desired output d, then the weights are adapted according to the equation W(n+1) = W(n) + η[d(n) - Y(n)]X(n), where W(n) = [b(n), W1(n), W2(n), W3(n), ..., Wm(n)].
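The adaptation rule above can be sketched as a short function; the names `adapt`, `eta`, `d`, `y`, and `x` are illustrative. The values in the example use an exact learning rate of 0.25 purely to keep the arithmetic clean.

```python
def adapt(weights, eta, d, y, x):
    """Weights adaptation rule: W(n+1) = W(n) + eta * (d(n) - Y(n)) * X(n).
    Applied only when the predicted output y differs from the desired d."""
    if y == d:
        return list(weights)   # weights are correct: no adaptation
    return [w + eta * (d - y) * xi for w, xi in zip(weights, x)]

# Desired 1, predicted 0: every weight moves up by eta * x_i.
new_w = adapt([0.5, 0.5], 0.25, 1, 0, [1, 1])   # [0.75, 0.75]
```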
  • 69. Neural Networks Training Example, Step n=0: in each step of the solution, the parameters of the neural network must be known. Parameters of step n=0: η = .001; X(n) = X(0) = (+1, +1, +1, 1, 0); W(n) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1); d(n) = d(0) = 1.
  • 70. Neural Networks Training Example, Step n=0: the network with these weights in place (b1 = -1.5, b2 = -.5, b3 = -.5, W1 = W2 = W3 = W4 = +1, W5 = -2, W6 = +1).
  • 71. Step n=0, SOP S1: S1 = +1*b1 + X1*W1 + X2*W3 = +1*-1.5 + 1*1 + 0*1 = -.5.
  • 72. Step n=0, Output of S1: Y(S1) = bin(S1) = bin(-.5) = 0, where bin(s) = +1 if s >= 0 and 0 if s < 0.
  • 73. Step n=0, SOP S2: S2 = +1*b2 + X1*W2 + X2*W4 = +1*-.5 + 1*1 + 0*1 = .5.
  • 74. Step n=0, Output of S2: Y(S2) = bin(S2) = bin(.5) = 1.
  • 75. Step n=0, SOP S3: S3 = +1*b3 + Y(S1)*W5 + Y(S2)*W6 = +1*-.5 + 0*-2 + 1*1 = .5.
  • 76. Step n=0, Output of S3: Y(S3) = bin(S3) = bin(.5) = 1.
  • 77. Step n=0, Output: Y(n) = Y(0) = Y(S3) = 1.
  • 78. Step n=0, Predicted vs. Desired: Y(n) = Y(0) = 1 and d(n) = d(0) = 1. Since Y(n) = d(n), the weights are correct and no adaptation is needed.
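The step n=0 forward pass above can be reproduced directly. Variable names are illustrative; the weights and the binary activation are the ones given in the example.

```python
def bin_step(s):
    """Binary activation: 1 if s >= 0, else 0."""
    return 1 if s >= 0 else 0

# Weights from the worked example: biases b1..b3 and weights w1..w6.
b1, b2, b3 = -1.5, -0.5, -0.5
w1, w2, w3, w4, w5, w6 = 1, 1, 1, 1, -2, 1

A, B = 1, 0                                        # input pattern X(0); desired d(0) = 1
s1 = b1 + A * w1 + B * w3                          # hidden neuron 1 SOP: -0.5
s2 = b2 + A * w2 + B * w4                          # hidden neuron 2 SOP: 0.5
s3 = b3 + bin_step(s1) * w5 + bin_step(s2) * w6    # output neuron SOP: 0.5
y = bin_step(s3)                                   # predicted output: 1, matching d(0)
```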
  • 79. Neural Networks Training Example, Step n=1: parameters of step n=1: η = .001; X(n) = X(1) = (+1, +1, +1, 0, 1); W(n) = W(1) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1); d(n) = d(1) = 1.
  • 80. Step n=1: the same network and weights as before, now with input (A, B) = (0, 1).
  • 81. Step n=1, SOP S1: S1 = +1*b1 + X1*W1 + X2*W3 = +1*-1.5 + 0*1 + 1*1 = -.5.
  • 82. Step n=1, Output of S1: Y(S1) = bin(S1) = bin(-.5) = 0.
  • 83. Step n=1, SOP S2: S2 = +1*b2 + X1*W2 + X2*W4 = +1*-.5 + 0*1 + 1*1 = .5.
  • 84. Step n=1, Output of S2: Y(S2) = bin(S2) = bin(.5) = 1.
  • 85. Step n=1, SOP S3: S3 = +1*b3 + Y(S1)*W5 + Y(S2)*W6 = +1*-.5 + 0*-2 + 1*1 = .5.
  • 86. Step n=1, Output of S3: Y(S3) = bin(S3) = bin(.5) = 1.
  • 87. Step n=1, Output: Y(n) = Y(1) = Y(S3) = 1.
  • 88. Step n=1, Predicted vs. Desired: Y(n) = Y(1) = 1 and d(n) = d(1) = 1. Since Y(n) = d(n), the weights are correct and no adaptation is needed.
  • 89. Neural Networks Training Example, Step n=2: parameters of step n=2: η = .001; X(n) = X(2) = (+1, +1, +1, 0, 0); W(n) = W(2) = W(1) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1); d(n) = d(2) = 0.
  • 90. Step n=2: the same network and weights, now with input (A, B) = (0, 0).
  • 91. Step n=2, SOP S1: S1 = +1*b1 + X1*W1 + X2*W3 = +1*-1.5 + 0*1 + 0*1 = -1.5.
  • 92. Step n=2, Output of S1: Y(S1) = bin(S1) = bin(-1.5) = 0.
  • 93. Step n=2, SOP S2: S2 = +1*b2 + X1*W2 + X2*W4 = +1*-.5 + 0*1 + 0*1 = -.5.
  • 94. Step n=2, Output of S2: Y(S2) = bin(S2) = bin(-.5) = 0.
  • 95. Step n=2, SOP S3: S3 = +1*b3 + Y(S1)*W5 + Y(S2)*W6 = +1*-.5 + 0*-2 + 0*1 = -.5.
  • 96. Step n=2, Output of S3: Y(S3) = bin(S3) = bin(-.5) = 0.
  • 97. Step n=2, Output: Y(n) = Y(2) = Y(S3) = 0.
  • 98. Step n=2, Predicted vs. Desired: Y(n) = Y(2) = 0 and d(n) = d(2) = 0. Since Y(n) = d(n), the weights are correct and no adaptation is needed.
  • 99. Neural Networks Training Example, Step n=3: parameters of step n=3: η = .001; X(n) = X(3) = (+1, +1, +1, 1, 1); W(n) = W(3) = W(2) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1); d(n) = d(3) = 0.
  • 100. Step n=3: the same network and weights, now with input (A, B) = (1, 1).
  • 101. Step n=3, SOP S1: S1 = +1*b1 + X1*W1 + X2*W3 = +1*-1.5 + 1*1 + 1*1 = .5.
  • 102. Step n=3, Output of S1: Y(S1) = bin(S1) = bin(.5) = 1.
  • 103. Step n=3, SOP S2: S2 = +1*b2 + X1*W2 + X2*W4 = +1*-.5 + 1*1 + 1*1 = 1.5.
  • 104. Step n=3, Output of S2: Y(S2) = bin(S2) = bin(1.5) = 1.
  • 105. Step n=3, SOP S3: S3 = +1*b3 + Y(S1)*W5 + Y(S2)*W6 = +1*-.5 + 1*-2 + 1*1 = -1.5.
  • 106. Step n=3, Output of S3: Y(S3) = bin(S3) = bin(-1.5) = 0.
  • 107. Step n=3, Output: Y(n) = Y(3) = Y(S3) = 0.
  • 108. Step n=3, Predicted vs. Desired: Y(n) = Y(3) = 0 and d(n) = d(3) = 0. Since Y(n) = d(n), the weights are correct and no adaptation is needed.
  • 109. Final Weights: the current weights predicted the desired output for every input pattern, so no adaptation was ever needed. The final weights are b1 = -1.5, b2 = -.5, b3 = -.5, W1 = W2 = W3 = W4 = +1, W5 = -2, W6 = +1.
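As a check, the final weights can be run over all four input patterns. The forward pass below is a sketch (helper names are illustrative); with these weights the first hidden neuron behaves like AND and the second like OR, and the combination reproduces the XOR truth table.

```python
def bin_step(s):
    """Binary activation: 1 if s >= 0, else 0."""
    return 1 if s >= 0 else 0

def predict(A, B):
    """Forward pass with the final weights:
    b1=-1.5, b2=-0.5, b3=-0.5, W1=W2=W3=W4=+1, W5=-2, W6=+1."""
    h1 = bin_step(-1.5 + A + B)          # hidden neuron 1: fires only when A = B = 1 (AND)
    h2 = bin_step(-0.5 + A + B)          # hidden neuron 2: fires when A or B is 1 (OR)
    return bin_step(-0.5 - 2 * h1 + h2)  # output neuron: OR and not AND

truth = {(A, B): predict(A, B) for A in (0, 1) for B in (0, 1)}
# truth == {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}, i.e. XOR
```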