Artificial Neural Networks (ANNs)
Step-By-Step Training & Testing
Example 2
MENOUFIA UNIVERSITY
FACULTY OF COMPUTERS AND INFORMATION
ALL DEPARTMENTS
ARTIFICIAL INTELLIGENCE
Ahmed Fawzy Gad
ahmed.fawzy@ci.menofia.edu.eg
Classification Example
Training data (features F1, F2 and class labels):

F1     F2     Class
121    16.8   C1
114    15.2   C1
210    9.4    C2
195    8.1    C2
Neural Networks
A neural network is built from three types of layers: Input, Hidden, and Output.
[Scatter plot of the samples: F1 axis 0-250, F2 axis 0-20, showing the two classes]
Input Layer
The input layer holds one neuron per feature: F1 and F2.
Output Layer
The output neuron Yj produces the predicted class label, C1 or C2.
Weights
Each input connection carries a weight Wi: W1 for F1 and W2 for F2.
Activation Function
The neuron applies an activation function F(s) to the sum of products s in order to produce the output Yj.
Components
A neuron therefore consists of: inputs (F1, F2), weights (W1, W2), a sum of products s, an activation function F(s), and an output Yj (C1/C2).
Inputs
The neuron first computes s, the sum of products (SOP) of the inputs Xi and the weights Wi:

s = SOP(Xi, Wi) = Σ (i=1..m) Xi·Wi

With the two inputs X1 = F1 and X2 = F2:

s = (X1·W1 + X2·W2)
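As a minimal plain-Python sketch of the sum of products (the function name `sop` is illustrative, not from the slides):

```python
def sop(inputs, weights):
    """Sum of products: s = X1*W1 + X2*W2 + ... + Xm*Wm."""
    return sum(x * w for x, w in zip(inputs, weights))

# First sample (F1=121, F2=16.8) with illustrative weights W1=-30, W2=300:
s = sop([121, 16.8], [-30, 300])  # 121*(-30) + 16.8*300
```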
Outputs
The activation function F(s) maps the sum s to a class label Yj (C1 or C2).
Activation Functions
Common activation functions include Piecewise Linear, Sigmoid, and Signum.

Which activation function to use? The function must produce as many distinct outputs as there are class labels. Since there are TWO class labels (C1 and C2), we need one that gives two outputs: the Signum function, which outputs only +1 or -1.
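The chosen Signum activation can be sketched directly from its definition (+1 for s ≥ 0, -1 otherwise):

```python
def sgn(s):
    """Signum activation: +1 if s >= 0, otherwise -1."""
    return 1 if s >= 0 else -1
```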
Activation Function
The network therefore computes Yj = sgn(s).
Bias
The bias is treated as an extra input X0 = +1 with its own weight W0:

s = (X0·W0 + X1·W1 + X2·W2)

Since X0 = +1, this simplifies to:

s = (W0 + X1·W1 + X2·W2)
Bias Importance
Consider the line equation y = ax + b and, for simplicity, y = x + b, where b is the Y-intercept. With b = 0 the line must pass through the origin; b = +v shifts it up and b = -v shifts it down. Without b, the line could never leave the origin. The bias plays the same role for the neuron: it lets the decision boundary shift away from the origin.

The same concept applies to the sum of products:

s = Σ (i=1..m) Xi·Wi + BIAS
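Folding the bias in as X0 = +1 with weight W0, the biased sum can be sketched as follows (the name `sop_with_bias` is illustrative; the weights are the initial ones used in the training example below):

```python
def sop_with_bias(features, weights):
    """Sum of products with the bias folded in: weights = (W0, W1, ..., Wm),
    and the bias input X0 = +1 is prepended to the features."""
    x = [1] + list(features)
    return sum(xi * wi for xi, wi in zip(x, weights))

# First sample (F1=121, F2=16.8) with initial weights (W0, W1, W2):
s = sop_with_bias([121, 16.8], [-1230, -30, 300])  # -1230 + 121*(-30) + 16.8*300
```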
Learning Rate
The learning rate η controls how strongly the weights are adapted: 0 ≤ η ≤ 1.
Summary of Parameters
• Inputs: X(n) = (X0, X1, X2)
• Weights: W(n) = (W0, W1, W2)
• Bias: b, folded in as X0 = +1 with weight W0
• Sum of Products (SOP): s = (X0·W0 + X1·W1 + X2·W2)
• Activation Function: sgn
• Outputs: Yj
• Learning Rate: 0 ≤ η ≤ 1
Other Parameters
• Step: n = 0, 1, 2, …
• Desired Output: d(n) = +1 if x(n) belongs to C1; -1 if x(n) belongs to C2
Neural Networks Training Steps
1. Weights Initialization
2. Inputs Application
3. Sum of Inputs-Weights Products
4. Activation Function Response Calculation
5. Weights Adaptation
6. Back to Step 2
Regarding 5th Step: Weights Adaptation
• If the predicted output Y is not the same as the desired output d, the weights are adapted according to the following equation:
W(n+1) = W(n) + η[d(n) - Y(n)]X(n)
Where
W(n) = [b(n), W1(n), W2(n), …, Wm(n)]
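The six training steps and the adaptation rule above can be sketched as a short plain-Python loop (names like `train_step` are illustrative; the data, initial weights, and η are the ones used in the worked example):

```python
def sgn(s):
    """Signum activation: +1 if s >= 0, otherwise -1."""
    return 1 if s >= 0 else -1

def train_step(weights, x, d, eta):
    """One training step: SOP, activation, then adapt weights if Y != d.
    x includes the bias input X0 = +1 as its first element."""
    s = sum(xi * wi for xi, wi in zip(x, weights))
    y = sgn(s)
    if y != d:
        # W(n+1) = W(n) + eta*(d(n) - Y(n))*X(n)
        weights = [wi + eta * (d - y) * xi for wi, xi in zip(weights, x)]
    return weights

eta = 0.01
weights = [-1230.0, -30.0, 300.0]      # initial (W0, W1, W2)
samples = [                            # ((X0, F1, F2), desired label)
    ([1.0, 121.0, 16.8], +1),
    ([1.0, 114.0, 15.2], +1),
    ([1.0, 210.0, 9.4], -1),
    ([1.0, 195.0, 8.1], -1),
]

# Step 6: keep cycling through the samples until a full pass adapts nothing.
for _ in range(100):
    before = list(weights)
    for x, d in samples:
        weights = train_step(weights, x, d, eta)
    if weights == before:
        break
# weights is now approximately (-1229.98, -27.72, 300.304)
```

Only one adaptation fires (at the second sample); after that, a full pass classifies every sample correctly and the loop stops.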
Neural Networks Training Example
Step n=0
• In each step of the solution, the parameters of the neural network must be known.
• Parameters of step n=0:
η = 0.01
X(n) = X(0) = (+1, 121, 16.8)
W(n) = W(0) = (-1230, -30, 300)
d(n) = d(0) = +1

The class labels are encoded as C1 = +1 and C2 = -1:

F1     F2     Class
121    16.8   C1 = +1
114    15.2   C1 = +1
210    9.4    C2 = -1
195    8.1    C2 = -1
Step n=0 - SOP
s = (X0·W0 + X1·W1 + X2·W2)
  = (+1)(-1230) + (121)(-30) + (16.8)(300)
  = 180
Step n=0 - Output
sgn(s) = +1 if s ≥ 0; -1 if s < 0
Y(n) = Y(0) = sgn(s) = sgn(180) = +1 → class C1
Step n=0 - Predicted vs. Desired
Y(n) = Y(0) = +1
d(n) = d(0) = +1
∵ Y(n) = d(n) ∴ the weights are correct; no adaptation is needed.
Step n=1
• Parameters of step n=1:
η = 0.01
X(n) = X(1) = (+1, 114, 15.2)
W(n) = W(1) = W(0) = (-1230, -30, 300)
d(n) = d(1) = +1
Step n=1 - SOP
s = (X0·W0 + X1·W1 + X2·W2)
  = (+1)(-1230) + (114)(-30) + (15.2)(300)
  = -90
Step n=1 - Output
Y(n) = Y(1) = sgn(s) = sgn(-90) = -1 → class C2
Step n=1 - Predicted vs. Desired
Y(n) = Y(1) = -1
d(n) = d(1) = +1
∵ Y(n) ≠ d(n) ∴ the weights are incorrect; adaptation is required.
Weights Adaptation
• According to
W(n+1) = W(n) + η[d(n) - Y(n)]X(n)
• Where n = 1:
W(2) = W(1) + η[d(1) - Y(1)]X(1)
W(2) = (-1230, -30, 300) + 0.01[+1 - (-1)](+1, 114, 15.2)
W(2) = (-1230, -30, 300) + 0.01(+2)(+1, 114, 15.2)
W(2) = (-1230, -30, 300) + 0.02(+1, 114, 15.2)
W(2) = (-1230, -30, 300) + (+0.02, 2.28, 0.304)
W(2) = (-1229.98, -27.72, 300.304)
Step n=2
• Parameters of step n=2:
η = 0.01
X(n) = X(2) = (+1, 210, 9.4)
W(n) = W(2) = (-1229.98, -27.72, 300.304)
d(n) = d(2) = -1
Step n=2 - SOP
s = (X0·W0 + X1·W1 + X2·W2)
  = (+1)(-1229.98) + (210)(-27.72) + (9.4)(300.304)
  = -4228.3224
Step n=2 - Output
Y(n) = Y(2) = sgn(s) = sgn(-4228.3224) = -1 → class C2
Step n=2 - Predicted vs. Desired
Y(n) = Y(2) = -1
d(n) = d(2) = -1
∵ Y(n) = d(n) ∴ the weights are correct; no adaptation is needed.
Step n=3
• Parameters of step n=3:
η = 0.01
X(n) = X(3) = (+1, 195, 8.1)
W(n) = W(3) = W(2) = (-1229.98, -27.72, 300.304)
d(n) = d(3) = -1
Step n=3 - SOP
s = (X0·W0 + X1·W1 + X2·W2)
  = (+1)(-1229.98) + (195)(-27.72) + (8.1)(300.304)
  = -4202.9176
Step n=3 - Output
Y(n) = Y(3) = sgn(s) = sgn(-4202.9176) = -1 → class C2
Step n=3 - Predicted vs. Desired
Y(n) = Y(3) = -1
d(n) = d(3) = -1
∵ Y(n) = d(n) ∴ the weights are correct; no adaptation is needed.
Step n=4
• Parameters of step n=4:
η = 0.01
X(n) = X(4) = (+1, 121, 16.8)
W(n) = W(4) = W(3) = (-1229.98, -27.72, 300.304)
d(n) = d(4) = +1
Step n=4 - SOP
s = (X0·W0 + X1·W1 + X2·W2)
  = (+1)(-1229.98) + (121)(-27.72) + (16.8)(300.304)
  = 461.0072
Step n=4 - Output
Y(n) = Y(4) = sgn(s) = sgn(461.0072) = +1 → class C1
Step n=4 - Predicted vs. Desired
Y(n) = Y(4) = +1
d(n) = d(4) = +1
∵ Y(n) = d(n) ∴ the weights are correct; no adaptation is needed.
Step n=5
• Parameters of step n=5:
η = 0.01
X(n) = X(5) = (+1, 114, 15.2)
W(n) = W(5) = W(4) = (-1229.98, -27.72, 300.304)
d(n) = d(5) = +1
Step n=5 - SOP
s = (X0·W0 + X1·W1 + X2·W2)
  = (+1)(-1229.98) + (114)(-27.72) + (15.2)(300.304)
  = 174.5608
Step n=5 - Output
Y(n) = Y(5) = sgn(s) = sgn(174.5608) = +1 → class C1
Step n=5 - Predicted vs. Desired
Y(n) = Y(5) = +1
d(n) = d(5) = +1
∵ Y(n) = d(n) ∴ the weights are correct; no adaptation is needed.
Correct Weights
• After testing the weights across all samples and finding every prediction correct, we can conclude that the current weights are the correct ones for the trained neural network.
• After the training phase comes testing the neural network.
• What is the class of the unknown sample with F1=140 and F2=17.9?
Testing Trained Neural Network
(F1, F2) = (140, 17.9)
Trained Neural Network Parameters:
W = (-1229.98, -27.72, 300.304)
Testing Trained Neural Network - SOP
s = (X0·W0 + X1·W1 + X2·W2)
  = (+1)(-1229.98) + (140)(-27.72) + (17.9)(300.304)
  = 264.6616
Testing Trained Neural Network - Output
Y = sgn(s) = sgn(264.6616) = +1
The unknown sample belongs to class C1.
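The testing computation can be sketched the same way, a minimal check assuming the trained weights from the example:

```python
def sgn(s):
    """Signum activation: +1 if s >= 0, otherwise -1."""
    return 1 if s >= 0 else -1

trained = [-1229.98, -27.72, 300.304]   # (W0, W1, W2) after training
x = [1.0, 140.0, 17.9]                  # bias X0=+1, then F1=140, F2=17.9

s = sum(xi * wi for xi, wi in zip(x, trained))
label = "C1" if sgn(s) == +1 else "C2"  # s is positive, so the sample is C1
```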
Operations in Digital Image Processing + Convolution by ExampleAhmed Gad
 
MATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and TrackingMATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and TrackingAhmed Gad
 
MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...Ahmed Gad
 
Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Ahmed Gad
 

More from Ahmed Gad (20)

ICEIT'20 Cython for Speeding-up Genetic Algorithm
ICEIT'20 Cython for Speeding-up Genetic AlgorithmICEIT'20 Cython for Speeding-up Genetic Algorithm
ICEIT'20 Cython for Speeding-up Genetic Algorithm
 
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
 
Python for Computer Vision - Revision 2nd Edition
Python for Computer Vision - Revision 2nd EditionPython for Computer Vision - Revision 2nd Edition
Python for Computer Vision - Revision 2nd Edition
 
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
 
M.Sc. Thesis - Automatic People Counting in Crowded Scenes
M.Sc. Thesis - Automatic People Counting in Crowded ScenesM.Sc. Thesis - Automatic People Counting in Crowded Scenes
M.Sc. Thesis - Automatic People Counting in Crowded Scenes
 
Derivation of Convolutional Neural Network from Fully Connected Network Step-...
Derivation of Convolutional Neural Network from Fully Connected Network Step-...Derivation of Convolutional Neural Network from Fully Connected Network Step-...
Derivation of Convolutional Neural Network from Fully Connected Network Step-...
 
Introduction to Optimization with Genetic Algorithm (GA)
Introduction to Optimization with Genetic Algorithm (GA)Introduction to Optimization with Genetic Algorithm (GA)
Introduction to Optimization with Genetic Algorithm (GA)
 
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
 
Avoid Overfitting with Regularization
Avoid Overfitting with RegularizationAvoid Overfitting with Regularization
Avoid Overfitting with Regularization
 
Genetic Algorithm (GA) Optimization - Step-by-Step Example
Genetic Algorithm (GA) Optimization - Step-by-Step ExampleGenetic Algorithm (GA) Optimization - Step-by-Step Example
Genetic Algorithm (GA) Optimization - Step-by-Step Example
 
ICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression AnalysisICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression Analysis
 
Backpropagation: Understanding How to Update ANNs Weights Step-by-Step
Backpropagation: Understanding How to Update ANNs Weights Step-by-StepBackpropagation: Understanding How to Update ANNs Weights Step-by-Step
Backpropagation: Understanding How to Update ANNs Weights Step-by-Step
 
Computer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and GradientComputer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and Gradient
 
Python for Computer Vision - Revision
Python for Computer Vision - RevisionPython for Computer Vision - Revision
Python for Computer Vision - Revision
 
Anime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia CourseAnime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia Course
 
Brief Introduction to Deep Learning + Solving XOR using ANNs
Brief Introduction to Deep Learning + Solving XOR using ANNsBrief Introduction to Deep Learning + Solving XOR using ANNs
Brief Introduction to Deep Learning + Solving XOR using ANNs
 
Operations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by ExampleOperations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by Example
 
MATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and TrackingMATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and Tracking
 
MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...
 
Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...
 

Recently uploaded

How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPCeline George
 
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfInclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfTechSoup
 
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Celine George
 
Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4JOYLYNSAMANIEGO
 
Food processing presentation for bsc agriculture hons
Food processing presentation for bsc agriculture honsFood processing presentation for bsc agriculture hons
Food processing presentation for bsc agriculture honsManeerUddin
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Mark Reed
 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxCarlos105
 
Full Stack Web Development Course for Beginners
Full Stack Web Development Course  for BeginnersFull Stack Web Development Course  for Beginners
Full Stack Web Development Course for BeginnersSabitha Banu
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatYousafMalik24
 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management systemChristalin Nelson
 
Karra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxKarra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxAshokKarra1
 
ROLES IN A STAGE PRODUCTION in arts.pptx
ROLES IN A STAGE PRODUCTION in arts.pptxROLES IN A STAGE PRODUCTION in arts.pptx
ROLES IN A STAGE PRODUCTION in arts.pptxVanesaIglesias10
 
Choosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for ParentsChoosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for Parentsnavabharathschool99
 
4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptxmary850239
 
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptxAUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptxiammrhaywood
 
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfGrade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfJemuel Francisco
 
Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Celine George
 

Recently uploaded (20)

How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERP
 
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfInclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
 
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
 
Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4
 
Food processing presentation for bsc agriculture hons
Food processing presentation for bsc agriculture honsFood processing presentation for bsc agriculture hons
Food processing presentation for bsc agriculture hons
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)
 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
 
Full Stack Web Development Course for Beginners
Full Stack Web Development Course  for BeginnersFull Stack Web Development Course  for Beginners
Full Stack Web Development Course for Beginners
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice great
 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management system
 
Karra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxKarra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptx
 
ROLES IN A STAGE PRODUCTION in arts.pptx
ROLES IN A STAGE PRODUCTION in arts.pptxROLES IN A STAGE PRODUCTION in arts.pptx
ROLES IN A STAGE PRODUCTION in arts.pptx
 
Choosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for ParentsChoosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for Parents
 
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptxYOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
 
4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx
 
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptxAUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptx
 
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfGrade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
 
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptxFINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
 
Raw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptxRaw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptx
 
Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17
 

Introduction to Artificial Neural Networks (ANNs) - Step-by-Step Training & Testing Example 2

  • 1. Artificial Neural Networks (ANNs) Step-By-Step Training & Testing Example 2. MENOUFIA UNIVERSITY, FACULTY OF COMPUTERS AND INFORMATION, ALL DEPARTMENTS. ARTIFICIAL INTELLIGENCE. Ahmed Fawzy Gad, ahmed.fawzy@ci.menofia.edu.eg
  • 2. Classification Example. Training data (features F1, F2 and class): F1=121, F2=16.8 → C1; F1=114, F2=15.2 → C1; F1=210, F2=9.4 → C2; F1=195, F2=8.1 → C2
  • 3. Neural Networks 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 Input Hidden Output
  • 4. Neural Networks 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20
  • 5. Input Layer Input Output 𝑭 𝟏 𝑭 𝟐 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20
  • 6. Output Layer Input Output C1/C2 𝒀𝒋 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20 𝑭 𝟏 𝑭 𝟐
  • 7. Weights Input Output Weights=𝑾𝒊 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 8. Activation Function Input Output 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 9. Activation Function Input Output 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 10. Activation Function Input Output 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 11. Activation Function Components Input Output 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 12. Activation Function Inputs Input Output s 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 13. Activation Function Inputs Input Output ss=SOP(𝑿𝒊, 𝑾𝒊) 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 14. Activation Function Inputs Input Output ss=SOP(𝑿𝒊, 𝑾𝒊) 𝑿𝒊=Inputs 𝑾𝒊=Weights 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 15. Activation Function Inputs Input Output ss=SOP(𝑿𝒊, 𝑾𝒊) 𝑿𝒊=Inputs 𝑾𝒊=Weights 𝑿 𝟏 𝑿 𝟐 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 16. Activation Function Inputs. s = SOP(Xi, Wi), where Xi = inputs and Wi = weights: s = Σ(i=1..m) XiWi. (Diagram: inputs X1, X2 with weights W1, W2 feed the output Yj = C1/C2.)
  • 17. Activation Function Inputs Input Output ss=SOP(𝑿𝒊, 𝑾𝒊) 𝑿𝒊=Inputs 𝑾𝒊=Weights s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
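The sum of products (SOP) described on these slides can be sketched in Python; the function and variable names here are illustrative, not from the slides:

```python
def sop(x, w):
    """Sum of products s = X1*W1 + X2*W2 + ... over paired inputs and weights."""
    return sum(xi * wi for xi, wi in zip(x, w))

# Illustrative values: two features and two weights.
s = sop([121, 16.8], [-30, 300])  # 121*(-30) + 16.8*300
```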
  • 18. Activation Function Outputs Input Output F(s)s Class Label 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 20. Activation Functions. Which activation function to use? The outputs are class labels (Yj → Cj): TWO class labels require TWO possible outputs, so choose an activation function that gives two outputs.
  • 22. Activation Function Input Output F(s)s sgn 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 0 50 100 150 200 250 0 5 10 15 20 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 23. Bias Input Output F(s)s sgn =+1𝑿 𝟎 W0 s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 24. Bias Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 25. Bias Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝑿 𝟎 = +𝟏 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 26. Bias Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(+𝟏 ∗ 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 27. Bias Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 28. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 29. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) X Y 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 30. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) X Y y=ax+b 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 31. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) X Y y=x+b 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 32. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) X Y y=x+b Y-Intercept 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 33. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) X Y y=x+b Y-Intercept b=0 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 34. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) X Y y=x+b Y-Intercept b=0 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 35. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) X Y y=x+b Y-Intercept b=+v 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 36. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) X Y y=x+b Y-Intercept b=+v 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 37. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) X Y y=x+b Y-Intercept b=-v 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 38. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) X Y y=x+b Y-Intercept b=-v 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 39. Bias Importance. s = (X1W1 + X2W2) becomes s = (W0 + X1W1 + X2W2); the same concept (the y-intercept b) applies to the bias: s = Σ(i=1..m) XiWi + BIAS, with X0 = +1 and weight W0.
  • 40. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊+BIAS 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 41. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊+BIAS 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 42. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊+BIAS 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 43. Bias Importance Input Output F(s)s sgn s=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) s=(𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊+BIAS 𝑭 𝟐𝑭 𝟏 16.8121 C1 15.2114 9.4210 C2 8.1195 =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 C1/C2 𝒀𝒋 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐
  • 44. Learning Rate F(s)s sgn s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝟎 ≤ η ≤ 𝟏=+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐 C1/C2 𝒀𝒋
  • 45. Summary of Parameters Inputs 𝑿 𝒎 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐) F(s)s sgn =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐 C1/C2 𝒀𝒋
  • 46. Summary of Parameters Weights 𝑾 𝒎 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝟎 ≤ η ≤ 𝟏 W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐) 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐) F(s)s sgn =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐 C1/C2 𝒀𝒋
  • 47. Summary of Parameters Bias 𝒃 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐) F(s)s sgn =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐 C1/C2 𝒀𝒋
  • 48. Summary of Parameters Sum Of Products (SOP) 𝒔 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐) F(s)s sgn =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐 C1/C2 𝒀𝒋
  • 49. Summary of Parameters Activation Function 𝒔𝒈𝒏 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐) F(s)s sgn =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐 C1/C2 𝒀𝒋
  • 50. Summary of Parameters Outputs 𝒀𝒋 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐) F(s)s sgn =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐 C1/C2 𝒀𝒋
  • 51. Summary of Parameters Learning Rate η s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐) F(s)s sgn =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐 C1/C2 𝒀𝒋
  • 52. Other Parameters Step n s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) 𝟎 ≤ η ≤ 𝟏𝒏 = 𝟎, 𝟏, 𝟐, … 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐) F(s)s sgn =+1𝑿 𝟎 W0 𝑿 𝟏 𝑿 𝟐 𝑭 𝟏 𝑭 𝟐 𝑾 𝟏 𝑾 𝟐 C1/C2 𝒀𝒋
  • 53. Other Parameters: Desired Output dj. s = (X0W0 + X1W1 + X2W2), 0 ≤ η ≤ 1, n = 0, 1, 2, …, X(n) = (X0, X1, X2), W(n) = (W0, W1, W2). d(n) = +1 if x(n) belongs to C1, and d(n) = −1 if x(n) belongs to C2.
  • 54. Neural Networks Training Steps Weights Initialization Inputs Application Sum of Inputs-Weights Products Activation Function Response Calculation Weights Adaptation Back to Step 2 1 2 3 4 5 6
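The six training steps above can be sketched as a loop for this single-neuron example. The data, initial weights, and learning rate follow the slides; the loop is a minimal one-pass sketch, not the exact slide-by-slide trace:

```python
# Minimal perceptron-style training sketch for the six steps above.
samples = [([121, 16.8], +1), ([114, 15.2], +1),   # class C1 -> d = +1
           ([210, 9.4], -1), ([195, 8.1], -1)]     # class C2 -> d = -1
w = [-1230.0, -30.0, 300.0]   # step 1: initial weights [W0, W1, W2]
eta = 0.01                    # learning rate

for features, d in samples:                     # step 2: apply inputs
    x = [1.0] + features                        # X0 = +1 carries the bias
    s = sum(xi * wi for xi, wi in zip(x, w))    # step 3: sum of products
    y = 1 if s >= 0 else -1                     # step 4: sgn activation
    if y != d:                                  # step 5: adapt if wrong
        w = [wi + eta * (d - y) * xi for wi, xi in zip(w, x)]
# step 6: go back to step 2 (repeat passes until no sample is misclassified)
```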
  • 55. Regarding 5th Step: Weights Adaptation • If the predicted output Y is not the same as the desired output d, then the weights are adapted according to the following equation: W(n + 1) = W(n) + η[d(n) − Y(n)]X(n), where W(n) = [b(n), W1(n), W2(n), W3(n), …, Wm(n)]
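The adaptation rule on this slide can be written directly as a function; `adapt` is an illustrative name, not from the slides:

```python
def adapt(w, x, d, y, eta):
    """W(n+1) = W(n) + eta*(d(n) - Y(n))*X(n), applied element-wise."""
    return [wi + eta * (d - y) * xi for wi, xi in zip(w, x)]

# When the prediction matches the desired output, d - y = 0 and the
# weights are returned unchanged:
adapt([1.0, 2.0], [1.0, 3.0], d=+1, y=+1, eta=0.01)  # -> [1.0, 2.0]
```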
  • 56. Neural Networks Training Example Step n=0 • In each step in the solution, the parameters of the neural network must be known. • Parameters of step n=0: η = .01 𝑋 𝑛 = 𝑋 0 = +1, 121, 16.8 𝑊 𝑛 = 𝑊 0 = −1230, −30, 300 𝑑 𝑛 = 𝑑 0 = +1 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195
  • 57. Neural Networks Training Example Step n=0 F(s)s sgn =+1𝑿 𝟎 -1230 𝑿 𝟏 𝑿 𝟐 121 𝟏𝟔. 𝟖 -30 300 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 C1/C2 𝒀(𝒏)
  • 58. Neural Networks Training Example Step n=0 - SOP s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) =+1*-1230+121*-30+16.8*300 =180 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 F(s)s sgn =+1𝑿 𝟎 -1230 𝑿 𝟏 𝑿 𝟐 121 𝟏𝟔. 𝟖 -30 300 C1/C2 𝒀(𝒏)
  • 59. Neural Networks Training Example Step n=0 - Output: Y(n) = Y(0) = SGN(s) = SGN(180) = +1, where sgn(s) = +1 if s ≥ 0 and −1 if s < 0.
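The sgn activation used throughout the example is a one-liner, mirroring the slide's definition:

```python
def sgn(s):
    """Signum as defined on the slide: +1 when s >= 0, otherwise -1."""
    return 1 if s >= 0 else -1

sgn(180)  # -> 1, so the sample is assigned to class C1
```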
  • 60. Neural Networks Training Example Step n=0 - Output 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 F(s)s sgn =+1𝑿 𝟎 -1230 𝑿 𝟏 𝑿 𝟐 121 𝟏𝟔. 𝟖 -30 300 𝒀 𝒏 = 𝒀 𝟎 = 𝑺𝑮𝑵 𝒔 = 𝑺𝑮𝑵 𝟏𝟖𝟎 = +𝟏 C1 +1
  • 61. Neural Networks Training Example Step n=0 Predicted Vs. Desired 𝒀 𝒏 = 𝒀 𝟎 = +𝟏 𝐝 𝒏 = 𝒅 𝟎 = +𝟏 ∵ 𝒀 𝒏 = 𝒅 𝒏 ∴ Weights are Correct. No Adaptation 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 F(s)s sgn =+1𝑿 𝟎 -1230 𝑿 𝟏 𝑿 𝟐 121 𝟏𝟔. 𝟖 -30 300 C1 +1
  • 62. Neural Networks Training Example Step n=1 • In each step in the solution, the parameters of the neural network must be known. • Parameters of step n=1: η = .01 𝑋 𝑛 = 𝑋 1 = +1, 121, 16.8 𝑊 𝑛 = 𝑊 1 = 𝑊 0 = −1230, −30, 300 𝑑 𝑛 = 𝑑 1 = +1 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195
  • 63. Neural Networks Training Example Step n=1 F(s)s sgn =+1𝑿 𝟎 -1230 𝑿 𝟏 𝑿 𝟐 114 𝟏𝟓. 𝟐 -30 300 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 C1/C2 𝒀(𝒏)
  • 64. Neural Networks Training Example Step n=1 - SOP s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) =+1*-1230+114*-30+15.2*300 =-90 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 F(s)s sgn =+1𝑿 𝟎 -1230 𝑿 𝟏 𝑿 𝟐 114 𝟏𝟓. 𝟐 -30 300 C1/C2 𝒀(𝒏)
  • 65. Neural Networks Training Example Step n=1 - Output: Y(n) = Y(1) = SGN(s) = SGN(−90) = −1, where sgn(s) = +1 if s ≥ 0 and −1 if s < 0.
  • 66. Neural Networks Training Example Step n=1 - Output 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 F(s)s sgn =+1𝑿 𝟎 -1230 𝑿 𝟏 𝑿 𝟐 114 𝟏𝟓. 𝟐 -30 300 𝒀 𝒏 = 𝒀 𝟏 = 𝑺𝑮𝑵 𝒔 = 𝑺𝑮𝑵 −𝟗𝟎 = −𝟏 C2 -1
  • 67. Neural Networks Training Example Step n=1 Predicted Vs. Desired 𝒀 𝒏 = 𝒀 𝟏 = −𝟏 𝐝 𝒏 = 𝒅 𝟏 = +𝟏 ∵ 𝒀 𝒏 ≠ 𝒅 𝒏 ∴ Weights are Incorrect. Adaptation Required. 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 F(s)s sgn =+1𝑿 𝟎 -1230 𝑿 𝟏 𝑿 𝟐 114 𝟏𝟓. 𝟐 -30 300 C2 -1
  • 68. Weights Adaptation • According to W(n + 1) = W(n) + η[d(n) − Y(n)]X(n) • Where n = 1: W(1 + 1) = W(1) + η[d(1) − Y(1)]X(1); W(2) = (−1230, −30, 300) + .01[+1 − (−1)](+1, 114, 15.2) = (−1230, −30, 300) + .02(+1, 114, 15.2) = (−1230, −30, 300) + (+.02, 2.28, .304) = (−1229.08, −27.72, 300.304)
  • 69. Neural Networks Training Example Step n=2 • In each step in the solution, the parameters of the neural network must be known. • Parameters of step n=2: η = .01 𝑋 𝑛 = 𝑋 2 = +1, 210, 9.4 𝑊 𝑛 = 𝑊 2 = −1229.08, −27.72, 300.304 𝑑 𝑛 = 𝑑 2 = −1 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195
  • 70. Neural Networks Training Example Step n=2 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 F(s)s sgn =+1𝑿 𝟎 - 1229. 08 𝑿 𝟏 𝑿 𝟐 210 𝟗. 𝟒 -27.72 300.3 04 C1/C2 𝒀(𝒏)
  • 71. Neural Networks Training Example Step n=2 - SOP s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) =+1*-1229.08+210*- 27.72+9.4*300.304 =-4227.4224 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 F(s)s sgn =+1𝑿 𝟎 - 1229. 08 𝑿 𝟏 𝑿 𝟐 210 𝟗. 𝟒 -27.72 300.3 04 C1/C2 𝒀(𝒏)
  • 72. Neural Networks Training Example Step n=2 - Output: Y(n) = Y(2) = SGN(s) = SGN(−4227.4224) = −1, where sgn(s) = +1 if s ≥ 0 and −1 if s < 0.
  • 73. Neural Networks Training Example Step n=2 - Output 𝒀 𝒏 = 𝒀 𝟐 = 𝑺𝑮𝑵 𝒔 = 𝑺𝑮𝑵 −4227.4224 = −𝟏 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 F(s)s sgn =+1𝑿 𝟎 - 1229. 08 𝑿 𝟏 𝑿 𝟐 210 𝟗. 𝟒 -27.72 300.3 04 C2 −𝟏
  • 74. Neural Networks Training Example Step n=2 Predicted Vs. Desired 𝒀 𝒏 = 𝒀 𝟐 = −𝟏 𝐝 𝒏 = 𝒅 𝟐 = −𝟏 ∵ 𝒀 𝒏 = 𝒅 𝒏 ∴ Weights are Correct. No Adaptation 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 F(s)s sgn =+1𝑿 𝟎 - 1229. 08 𝑿 𝟏 𝑿 𝟐 210 𝟗. 𝟒 -27.72 300.3 04 C2 −𝟏
  • 75. Neural Networks Training Example Step n=3 • In each step in the solution, the parameters of the neural network must be known. • Parameters of step n=3: η = .01 𝑋 𝑛 = 𝑋 3 = +1, 210, 9.4 𝑊 𝑛 = 𝑊 3 = 𝑊 2 = −1229.08, −27.72, 300.304 𝑑 𝑛 = 𝑑 3 = −1 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195
  • 76. Neural Networks Training Example Step n=3 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 F(s)s sgn =+1𝑿 𝟎 - 1229. 08 𝑿 𝟏 𝑿 𝟐 195 𝟖. 𝟏 -27.72 300.3 04 C1/C2 𝒀(𝒏)
  • 77. Neural Networks Training Example Step n=3 - SOP s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐) =+1*-1229.08+195*- 27.72+8.1*300.304 =-4202.0176 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 F(s)s sgn =+1𝑿 𝟎 - 1229. 08 𝑿 𝟏 𝑿 𝟐 195 𝟖. 𝟏 -27.72 300.3 04 C1/C2 𝒀(𝒏)
  • 78. Neural Networks Training Example Step n=3 - Output: Y(n) = Y(3) = SGN(s) = SGN(−4202.0176) = −1, where sgn(s) = +1 if s ≥ 0 and −1 if s < 0.
  • 79. Neural Networks Training Example Step n=3 - Output 𝒀 𝒏 = 𝒀 𝟑 = 𝑺𝑮𝑵 𝒔 = 𝑺𝑮𝑵 −4202.0176 = −𝟏 𝑭 𝟐𝑭 𝟏 16.8121 C1 = +1 15.2114 9.4210 C2 = -1 8.1195 F(s)s sgn =+1𝑿 𝟎 - 1229. 08 𝑿 𝟏 𝑿 𝟐 195 𝟖. 𝟏 -27.72 300.3 04 C2 −𝟏
  • 80. Neural Networks Training Example Step n=3, Predicted Vs. Desired: Y(n) = Y(3) = −1, d(n) = d(3) = −1. ∵ Y(n) = d(n) ∴ Weights are Correct. No Adaptation. (Output −1 → class C2.)
  • 81. Neural Networks Training Example Step n=4 • In each step in the solution, the parameters of the neural network must be known. • Parameters of step n=4: η = 0.01, X(n) = X(4) = (+1, 121, 16.8), W(n) = W(4) = W(3) = (−1229.08, −27.72, 300.304), d(n) = d(4) = +1
  • 82. Neural Networks Training Example Step n=4 [Neuron diagram: inputs X0 = +1, X1 = 121, X2 = 16.8; weights −1229.08, −27.72, 300.304; sgn activation; output Y(n)]
  • 83. Neural Networks Training Example Step n=4 - SOP s = X0·W0 + X1·W1 + X2·W2 = +1*(−1229.08) + 121*(−27.72) + 16.8*300.304 = 461.91
  • 84. Neural Networks Training Example Step n=4 - Output Y(n) = Y(4) = SGN(s) = SGN(461.91) = +1, where sgn(s) = +1 if s ≥ 0 and −1 if s < 0
  • 85. Neural Networks Training Example Step n=4 - Output Y(n) = Y(4) = SGN(s) = SGN(461.91) = +1 → class C1
  • 86. Neural Networks Training Example Step n=4 Predicted Vs. Desired Y(n) = Y(4) = +1, d(n) = d(4) = +1. ∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
  • 87. Neural Networks Training Example Step n=5 • In each step in the solution, the parameters of the neural network must be known. • Parameters of step n=5: η = 0.01, X(n) = X(5) = (+1, 114, 15.2), W(n) = W(5) = W(4) = (−1229.08, −27.72, 300.304), d(n) = d(5) = +1
  • 88. Neural Networks Training Example Step n=5 [Neuron diagram: inputs X0 = +1, X1 = 114, X2 = 15.2; weights −1229.08, −27.72, 300.304; sgn activation; output Y(n)]
  • 89. Neural Networks Training Example Step n=5 - SOP s = X0·W0 + X1·W1 + X2·W2 = +1*(−1229.08) + 114*(−27.72) + 15.2*300.304 = 175.46
  • 90. Neural Networks Training Example Step n=5 - Output Y(n) = Y(5) = SGN(s) = SGN(175.46) = +1, where sgn(s) = +1 if s ≥ 0 and −1 if s < 0
  • 91. Neural Networks Training Example Step n=5 - Output Y(n) = Y(5) = SGN(s) = SGN(175.46) = +1 → class C1
  • 92. Neural Networks Training Example Step n=5 Predicted Vs. Desired Y(n) = Y(5) = +1, d(n) = d(5) = +1. ∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
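The stopping condition reached above can be checked in one pass: with the final weights, every training sample must be classified correctly. A minimal Python sketch (illustrative names, not the slides' own code):

```python
def sgn(s):
    """Sign activation: +1 if s >= 0, otherwise -1."""
    return 1 if s >= 0 else -1

W = [-1229.08, -27.72, 300.304]   # final trained weights
samples = [                        # ([X0, F1, F2], desired output)
    ([1, 121, 16.8], +1),          # C1
    ([1, 114, 15.2], +1),          # C1
    ([1, 210, 9.4], -1),           # C2
    ([1, 195, 8.1], -1),           # C2
]

all_correct = all(
    sgn(sum(x * w for x, w in zip(X, W))) == d for X, d in samples
)
print(all_correct)   # True -> training can stop; weights are correct
```

Since all four predictions match the desired labels, no step triggers a weight update and training terminates.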
  • 93. Correct Weights • After testing the weights across all samples and getting correct results, we can conclude that the current weights are the correct ones for the trained neural network. • After the training phase comes testing the neural network. • What is the class of the unknown color with F1 = 140 and F2 = 17.9?
  • 94. Testing Trained Neural Network (F1, F2) = (140, 17.9) Trained Neural Network Parameters: W = (−1229.08, −27.72, 300.304)
  • 95. Testing Trained Neural Network (F1, F2) = (140, 17.9) SOP s = X0·W0 + X1·W1 + X2·W2 = +1*(−1229.08) + 140*(−27.72) + 17.9*300.304 = 265.56
  • 96. Testing Trained Neural Network (F1, F2) = (140, 17.9) Output Y = SGN(s) = SGN(265.56) = +1, where sgn(s) = +1 if s ≥ 0 and −1 if s < 0
  • 97. Testing Trained Neural Network (F1, F2) = (140, 17.9) Output Y = SGN(s) = SGN(265.56) = +1 → class C1
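The testing phase above is just one forward pass with the trained weights: compute the SOP for the unknown sample, apply SGN, and map +1/−1 back to C1/C2. A minimal Python sketch (names are illustrative, not from the slides):

```python
def sgn(s):
    """Sign activation: +1 if s >= 0, otherwise -1."""
    return 1 if s >= 0 else -1

W = [-1229.08, -27.72, 300.304]   # trained weights
X = [1, 140, 17.9]                # bias X0 = +1, unknown sample (F1, F2)

s = sum(x * w for x, w in zip(X, W))   # sum of products (SOP)
label = 'C1' if sgn(s) == +1 else 'C2'

print(round(s, 2))   # 265.56
print(label)         # C1
```

The same forward pass classifies any new (F1, F2) pair; only the input vector changes, the weights stay fixed after training.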