
Introduction to Artificial Neural Networks (ANNs) - Step-by-Step Training & Testing Example

This is an introduction to artificial neural networks (ANNs), including the idea of classification and how ANNs can classify data into a number of distinct classes based on some features.
A basic neural network example is given that uses a single-layer perceptron with three inputs and one output to classify data linearly using the signum activation function.
The presented example classifies color data into two categories (Red and Blue).
Artificial neural networks (ANNs), or connectionist systems, are a computational model used in machine learning, computer science, and other research disciplines. They are based on a large collection of connected simple units called artificial neurons, loosely analogous to neurons in a biological brain. Connections between neurons carry an activation signal of varying strength. If the combined incoming signals are strong enough, the neuron becomes activated and the signal travels to other neurons connected to it. Such systems can be trained from examples rather than explicitly programmed, and they excel in areas where the solution or feature detection is difficult to express in a traditional computer program. Like other machine learning methods, neural networks have been used to solve a wide variety of tasks, such as computer vision and speech recognition, that are difficult to solve using ordinary rule-based programming.
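
For readers who prefer code, the short sketch below illustrates the kind of single-layer perceptron described here: a weighted sum of the R, G, B inputs plus a bias, passed through the signum function to produce a class label. It is only an illustration; the weights in this sketch are made up, and the actual weights used in the slides appear in the worked example later.

    # Illustrative single-layer perceptron with a signum activation (weights are made up).

    def sgn(s):
        """Signum activation: +1 for s >= 0, -1 otherwise."""
        return 1 if s >= 0 else -1

    def predict(weights, rgb):
        """weights = [W0, W1, W2, W3]; W0 is the bias weight (its input is fixed at +1)."""
        s = weights[0] + sum(w * x for w, x in zip(weights[1:], rgb))
        return sgn(s)

    example_weights = [0.0, -1.0, 0.0, 1.0]          # illustrative values only
    print(predict(example_weights, (255, 0, 0)))     # -1 -> the "RED" label
    print(predict(example_weights, (0, 0, 255)))     # +1 -> the "BLUE" label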

Find me on:
AFCIT
http://www.afcit.xyz

YouTube
https://www.youtube.com/channel/UCuewOYbBXH5gwhfOrQOZOdw

Google Plus
https://plus.google.com/u/0/+AhmedGadIT

SlideShare
https://www.slideshare.net/AhmedGadFCIT

LinkedIn
https://www.linkedin.com/in/ahmedfgad/

ResearchGate
https://www.researchgate.net/profile/Ahmed_Gad13

Academia
https://www.academia.edu/

Google Scholar
https://scholar.google.com.eg/citations?user=r07tjocAAAAJ&hl=en

Mendeley
https://www.mendeley.com/profiles/ahmed-gad12/

ORCID
https://orcid.org/0000-0003-1978-8574

Stack Overflow
http://stackoverflow.com/users/5426539/ahmed-gad

Twitter
https://twitter.com/ahmedfgad

Facebook
https://www.facebook.com/ahmed.f.gadd

Pinterest
https://www.pinterest.com/ahmedfgad/


  1. 1. Artificial Neural Networks (ANNs) Step-By-Step Training & Testing Example. Menoufia University, Faculty of Computers and Information, All Departments, Artificial Intelligence. Ahmed Fawzy Gad, ahmed.fawzy@ci.menofia.edu.eg
  2. 2. Neural Networks & Classification
  3. 3. Linear Classifiers
  4. 4. Linear Classifiers
  5. 5. Linear Classifiers
  6. 6. Linear Classifiers Complex Data
  7. 7. Linear Classifiers Complex Data
  8. 8. Linear Classifiers Complex Data
  9. 9. Not Solved Linearly
  10. 10. Not Solved Linearly
  11. 11. Nonlinear Classifiers
  12. 12. Nonlinear Classifiers Training
  13. 13. Nonlinear Classifiers Training
  14. 14. Nonlinear Classifiers Training
  15. 15. Classification Example: training samples (R, G, B -> class): (255, 0, 0 -> RED), (248, 80, 68 -> RED), (0, 0, 255 -> BLUE), (67, 15, 210 -> BLUE).
  16. 16. Neural Networks: network diagram (Input, Hidden, Output layers) alongside the color table from slide 15.
  17. 17. Neural Networks
  18. 18. Input Layer: the input layer has one input per feature: R, G, B.
  19. 19. Output Layer: a single output Yj gives the class label, RED/BLUE.
  20. 20. Weights: each input is connected to the output through a weight Wi (W1, W2, W3).
  21. 21. Activation Function
  22. 22. Activation Function
  23. 23. Activation Function
  24. 24. Activation Function Components
  25. 25. Activation Function Inputs: the activation function receives a single value s.
  26. 26. Activation Function Inputs: s = SOP(Xi, Wi).
  27. 27. Activation Function Inputs: s = SOP(Xi, Wi), where Xi = inputs and Wi = weights.
  28. 28. Activation Function Inputs: the inputs are X1, X2, X3.
  29. 29. Activation Function Inputs: s = Σ(i=1..m) Xi·Wi.
  30. 30. Activation Function Inputs: s = X1·W1 + X2·W2 + X3·W3.
  31. 31. Activation Function Outputs: F(s) produces the output Yj, the class label.
  32. 32. Activation Functions: candidates are Piecewise Linear, Sigmoid, and Signum.
  33. 33. Activation Functions: which activation function to use? The outputs are the class labels Cj; there are TWO class labels, so we need an activation function that gives TWO outputs.
  34. 34. Activation Functions: Signum is selected.
  35. 35. Activation Function: the output is Yj = sgn(s).
  36. 36. Bias: a bias input X0 = +1 with weight W0 is added to the neuron; without it, s = X1·W1 + X2·W2 + X3·W3.
  37. 37. Bias: with the bias, s = X0·W0 + X1·W1 + X2·W2 + X3·W3.
  38. 38. Bias: the bias input is fixed at X0 = +1.
  39. 39. Bias: substituting X0 = +1 gives s = (+1)·W0 + X1·W1 + X2·W2 + X3·W3.
  40. 40. Bias: so s = W0 + X1·W1 + X2·W2 + X3·W3.
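
A quick sketch of the formula on slide 40 (an illustration, not code from the slides): with X0 = +1 prepended to the input vector, the bias is handled as just another weight inside the sum of products. The weight and input values here are taken from the worked example that starts on slide 69.

    # s = W0 + X1*W1 + X2*W2 + X3*W3, computed with the bias as an ordinary weight (X0 = +1).
    W = [-1.0, -2.0, 1.0, 6.2]      # [W0, W1, W2, W3]; the initial weights used later in the slides
    X = [+1, 255, 0, 0]             # [X0 = +1, R, G, B]; the first sample of the worked example

    s = sum(w * x for w, x in zip(W, X))
    print(s)                        # -511.0, matching step n=0 of the worked example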
  41. 41. Bias Importance
  42. 42. Bias Importance: analogy with a line in the X-Y plane.
  43. 43. Bias Importance: the line y = x + b.
  44. 44. Bias Importance: the line y = x + b.
  45. 45. Bias Importance: b is the y-intercept.
  46. 46. Bias Importance: y = x + b with y-intercept b = 0.
  47. 47. Bias Importance: y = x + b with y-intercept b = 0.
  48. 48. Bias Importance: y = x + b with y-intercept b = +v.
  49. 49. Bias Importance: y = x + b with y-intercept b = +v.
  50. 50. Bias Importance: y = x + b with y-intercept b = -v.
  51. 51. Bias Importance: y = x + b with y-intercept b = +v.
  52. 52. Bias Importance: the same concept applies to the bias: s = Σ(i=1..m) Xi·Wi + BIAS.
  53. 53. Bias Importance: s = Σ(i=1..m) Xi·Wi + BIAS.
  54. 54. Bias Importance: s = Σ(i=1..m) Xi·Wi + BIAS.
  55. 55. Bias Importance: s = Σ(i=1..m) Xi·Wi + BIAS.
  56. 56. Bias Importance: s = Σ(i=1..m) Xi·Wi + BIAS.
  57. 57. Learning Rate: the learning rate η satisfies 0 ≤ η ≤ 1; the sum of products is s = X0·W0 + X1·W1 + X2·W2 + X3·W3.
  58. 58. Summary of Parameters - Inputs: X(n) = (X0, X1, X2, X3).
  59. 59. Summary of Parameters - Weights: W(n) = (W0, W1, W2, W3).
  60. 60. Summary of Parameters - Bias: b (handled as W0 with X0 = +1).
  61. 61. Summary of Parameters - Sum of Products (SOP): s.
  62. 62. Summary of Parameters - Activation Function: sgn.
  63. 63. Summary of Parameters - Outputs: Yj.
  64. 64. Summary of Parameters - Learning Rate: η, with 0 ≤ η ≤ 1.
  65. 65. Other Parameters - Step: n = 0, 1, 2, …
  66. 66. Other Parameters - Desired Output dj: d(n) = -1 if x(n) belongs to C1 (RED), and d(n) = +1 if x(n) belongs to C2 (BLUE); the class column of the color table thus becomes RED = -1 and BLUE = +1.
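
Using the encoding on slide 66, the four training samples from the color table can be written out directly. A small sketch (the list name is mine):

    # Training samples from the color table, with the bias input X0 = +1 prepended
    # and classes encoded as RED = -1, BLUE = +1.
    training_data = [
        ([+1, 255,   0,   0], -1),   # RED
        ([+1, 248,  80,  68], -1),   # RED
        ([+1,   0,   0, 255], +1),   # BLUE
        ([+1,  67,  15, 210], +1),   # BLUE
    ]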
  67. 67. Neural Networks Training Steps: (1) Weights Initialization, (2) Inputs Application, (3) Sum of Inputs-Weights Products, (4) Activation Function Response Calculation, (5) Weights Adaptation, (6) Back to Step 2.
  68. 68. Regarding the 5th Step (Weights Adaptation): if the predicted output Y is not the same as the desired output d, the weights are adapted according to W(n+1) = W(n) + η·[d(n) - Y(n)]·X(n), where W(n) = [b(n), W1(n), W2(n), ..., Wm(n)].
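
The adaptation rule from slide 68 translates directly into a small helper. A sketch, with vectors represented as plain Python lists (the function name is mine):

    def adapt_weights(W, X, d, Y, eta):
        """Perceptron weight update: W(n+1) = W(n) + eta * (d(n) - Y(n)) * X(n).
        W and X include the bias entries (W0 and X0 = +1)."""
        return [w + eta * (d - Y) * x for w, x in zip(W, X)]

Note that when Y(n) = d(n) the factor (d - Y) is zero and the weights are left unchanged, which matches the "no adaptation" steps in the walkthrough below.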
  69. 69. Neural Networks Training Example, Step n=0: in each step of the solution, the parameters of the neural network must be known. Parameters of step n=0: η = .001, X(n) = X(0) = (+1, 255, 0, 0), W(n) = W(0) = (-1, -2, 1, 6.2), d(n) = d(0) = -1.
  70. 70. Neural Networks Training Example, Step n=0: the sample (255, 0, 0) is applied to the network with weights (-1, -2, 1, 6.2).
  71. 71. Neural Networks Training Example, Step n=0 - SOP: s = X0·W0 + X1·W1 + X2·W2 + X3·W3 = (+1)(-1) + (255)(-2) + (0)(1) + (0)(6.2) = -511.
  72. 72. Neural Networks Training Example, Step n=0 - Output: with sgn(s) = +1 if s ≥ 0 and -1 if s < 0, Y(n) = Y(0) = SGN(s) = SGN(-511) = -1.
  73. 73. Neural Networks Training Example, Step n=0 - Output: Y(0) = -1, i.e. RED.
  74. 74. Neural Networks Training Example, Step n=0 - Predicted vs. Desired: Y(n) = Y(0) = -1 and d(n) = d(0) = -1; since Y(n) = d(n), the weights are correct and no adaptation is needed.
  75. 75. Neural Networks Training Example, Step n=1: parameters of step n=1: η = .001, X(n) = X(1) = (+1, 248, 80, 68), W(n) = W(1) = W(0) = (-1, -2, 1, 6.2), d(n) = d(1) = -1.
  76. 76. Neural Networks Training Example, Step n=1: the sample (248, 80, 68) is applied to the network with weights (-1, -2, 1, 6.2).
  77. 77. Neural Networks Training Example, Step n=1 - SOP: s = (+1)(-1) + (248)(-2) + (80)(1) + (68)(6.2) = 4.6.
  78. 78. Neural Networks Training Example, Step n=1 - Output: Y(n) = Y(1) = SGN(s) = SGN(4.6) = +1.
  79. 79. Neural Networks Training Example, Step n=1 - Output: Y(1) = +1, i.e. BLUE.
  80. 80. Neural Networks Training Example, Step n=1 - Predicted vs. Desired: Y(n) = Y(1) = +1 but d(n) = d(1) = -1; since Y(n) ≠ d(n), the weights are incorrect and adaptation is required.
  81. 81. Weights Adaptation: according to W(n+1) = W(n) + η·[d(n) - Y(n)]·X(n) with n = 1: W(2) = W(1) + η·[d(1) - Y(1)]·X(1) = (-1, -2, 1, 6.2) + .001·[-1 - (+1)]·(+1, 248, 80, 68) = (-1, -2, 1, 6.2) + .001·(-2)·(+1, 248, 80, 68) = (-1, -2, 1, 6.2) + (-.002)·(+1, 248, 80, 68) = (-1, -2, 1, 6.2) + (-.002, -.496, -.16, -.136) = (-1.002, -2.496, .84, 6.064).
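
The arithmetic on slide 81 can be reproduced in a few lines. A sketch (variable names are mine):

    eta = 0.001
    W1 = [-1.0, -2.0, 1.0, 6.2]     # W(1) = W(0)
    X1 = [+1, 248, 80, 68]          # X(1)
    d1, Y1 = -1, +1                 # desired vs. predicted output at step n=1

    W2 = [w + eta * (d1 - Y1) * x for w, x in zip(W1, X1)]
    print(W2)                       # approximately [-1.002, -2.496, 0.84, 6.064]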
  82. 82. Neural Networks Training Example, Step n=2: parameters of step n=2: η = .001, X(n) = X(2) = (+1, 0, 0, 255), W(n) = W(2) = (-1.002, -2.496, .84, 6.064), d(n) = d(2) = +1.
  83. 83. Neural Networks Training Example, Step n=2: the sample (0, 0, 255) is applied to the network with weights (-1.002, -2.496, .84, 6.064).
  84. 84. Neural Networks Training Example, Step n=2 - SOP: s = (+1)(-1.002) + (0)(-2.496) + (0)(.84) + (255)(6.064) = 1545.318.
  85. 85. Neural Networks Training Example, Step n=2 - Output: Y(n) = Y(2) = SGN(s) = SGN(1545.318) = +1.
  86. 86. Neural Networks Training Example, Step n=2 - Output: Y(2) = +1, i.e. BLUE.
  87. 87. Neural Networks Training Example, Step n=2 - Predicted vs. Desired: Y(n) = Y(2) = +1 and d(n) = d(2) = +1; since Y(n) = d(n), the weights are correct and no adaptation is needed.
  88. 88. Neural Networks Training Example, Step n=3: parameters of step n=3: η = .001, X(n) = X(3) = (+1, 67, 15, 210), W(n) = W(3) = W(2) = (-1.002, -2.496, .84, 6.064), d(n) = d(3) = +1.
  89. 89. Neural Networks Training Example, Step n=3: the sample (67, 15, 210) is applied to the network with weights (-1.002, -2.496, .84, 6.064).
  90. 90. Neural Networks Training Example, Step n=3 - SOP: s = (+1)(-1.002) + (67)(-2.496) + (15)(.84) + (210)(6.064) = 1117.806.
  91. 91. Neural Networks Training Example, Step n=3 - Output: Y(n) = Y(3) = SGN(s) = SGN(1117.806) = +1.
  92. 92. Neural Networks Training Example, Step n=3 - Output: Y(3) = +1, i.e. BLUE.
  93. 93. Neural Networks Training Example, Step n=3 - Predicted vs. Desired: Y(n) = Y(3) = +1 and d(n) = d(3) = +1; since Y(n) = d(n), the weights are correct and no adaptation is needed.
  94. 94. Neural Networks Training Example, Step n=4: parameters of step n=4: η = .001, X(n) = X(4) = (+1, 255, 0, 0), W(n) = W(4) = W(3) = (-1.002, -2.496, .84, 6.064), d(n) = d(4) = -1.
  95. 95. Neural Networks Training Example, Step n=4: the sample (255, 0, 0) is applied to the network with weights (-1.002, -2.496, .84, 6.064).
  96. 96. Neural Networks Training Example, Step n=4 - SOP: s = (+1)(-1.002) + (255)(-2.496) + (0)(.84) + (0)(6.064) = -637.482.
  97. 97. Neural Networks Training Example, Step n=4 - Output: Y(n) = Y(4) = SGN(s) = SGN(-637.482) = -1.
  98. 98. Neural Networks Training Example, Step n=4 - Output: Y(4) = -1, i.e. RED.
  99. 99. Neural Networks Training Example, Step n=4 - Predicted vs. Desired: Y(n) = Y(4) = -1 and d(n) = d(4) = -1; since Y(n) = d(n), the weights are correct and no adaptation is needed.
  100. 100. Neural Networks Training Example, Step n=5: parameters of step n=5: η = .001, X(n) = X(5) = (+1, 248, 80, 68), W(n) = W(5) = W(4) = (-1.002, -2.496, .84, 6.064), d(n) = d(5) = -1.
  101. 101. Neural Networks Training Example, Step n=5: the sample (248, 80, 68) is applied to the network with weights (-1.002, -2.496, .84, 6.064).
  102. 102. Neural Networks Training Example, Step n=5 - SOP: s = (+1)(-1.002) + (248)(-2.496) + (80)(.84) + (68)(6.064) = -140.458.
  103. 103. Neural Networks Training Example, Step n=5 - Output: Y(n) = Y(5) = SGN(s) = SGN(-140.458) = -1.
  104. 104. Neural Networks Training Example, Step n=5 - Output: Y(5) = -1, i.e. RED.
  105. 105. Neural Networks Training Example, Step n=5 - Predicted vs. Desired: Y(n) = Y(5) = -1 and d(n) = d(5) = -1; since Y(n) = d(n), the weights are correct and no adaptation is needed.
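
Putting the training steps together, the loop below re-runs the walkthrough above: it cycles over the four samples, adapts the weights whenever the prediction disagrees with the desired output, and stops once a full pass over the data produces no errors. This is a sketch under the slides' settings (η = .001, initial weights (-1, -2, 1, 6.2)); function and variable names are mine.

    def sgn(s):
        """Signum activation: +1 for s >= 0, -1 otherwise."""
        return 1 if s >= 0 else -1

    def train(samples, W, eta=0.001, max_passes=100):
        """Perceptron training as in the slides: adapt weights only on misclassification."""
        for _ in range(max_passes):
            errors = 0
            for X, d in samples:
                s = sum(w * x for w, x in zip(W, X))     # sum of products (bias included)
                Y = sgn(s)                               # activation function response
                if Y != d:                               # step 5: weights adaptation
                    W = [w + eta * (d - Y) * x for w, x in zip(W, X)]
                    errors += 1
            if errors == 0:                              # a full pass with no errors: stop
                break
        return W

    # The four samples from the color table (X0 = +1 prepended, RED = -1, BLUE = +1).
    samples = [
        ([+1, 255,   0,   0], -1),   # RED
        ([+1, 248,  80,  68], -1),   # RED
        ([+1,   0,   0, 255], +1),   # BLUE
        ([+1,  67,  15, 210], +1),   # BLUE
    ]
    W = train(samples, W=[-1.0, -2.0, 1.0, 6.2])
    print(W)   # approximately [-1.002, -2.496, 0.84, 6.064], matching the walkthrough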
  106. 106. Correct Weights: after testing the weights across all samples and finding every result correct, we can conclude that the current weights are the final trained weights of the neural network. After the training phase comes testing the neural network: what is the class of an unknown color with the values R=150, G=100, B=180?
  107. 107. Testing the Trained Neural Network, (R, G, B) = (150, 100, 180): trained network parameters: η = .001, W = (-1.002, -2.496, .84, 6.064).
  108. 108. Testing the Trained Neural Network, (R, G, B) = (150, 100, 180) - SOP: s = (+1)(-1.002) + (150)(-2.496) + (100)(.84) + (180)(6.064) = 800.118.
  109. 109. Testing the Trained Neural Network, (R, G, B) = (150, 100, 180) - Output: Y = SGN(s) = SGN(800.118) = +1.
  110. 110. Testing the Trained Neural Network, (R, G, B) = (150, 100, 180) - Output: Y = +1, i.e. BLUE.
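
The test on slides 107-110 can likewise be reproduced with the trained weights. A sketch:

    def sgn(s):
        return 1 if s >= 0 else -1

    W = [-1.002, -2.496, 0.84, 6.064]     # trained weights [W0, W1, W2, W3]
    X = [+1, 150, 100, 180]               # unknown color (R, G, B) = (150, 100, 180), with X0 = +1

    s = sum(w * x for w, x in zip(W, X))  # sum of products
    print(s, sgn(s))                      # approximately 800.118, +1 -> BLUE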
