
Artificial Neural Networks (ANNs) - XOR - Step-By-Step

Artificial neural networks (ANNs), or connectionist systems, are a computational model used in machine learning, computer science, and other research disciplines. The model is based on a large collection of connected simple units called artificial neurons, loosely analogous to the neurons in a biological brain; the connections between them, like biological axons and synapses, carry activation signals of varying strength. If the combined incoming signals are strong enough, a neuron becomes activated and the signal travels on to the other neurons connected to it. Such systems can be trained from examples rather than explicitly programmed, and they excel where the solution or the relevant features are difficult to express in a traditional computer program. Like other machine learning methods, neural networks have been used to solve a wide variety of tasks that are hard to solve with ordinary rule-based programming, such as computer vision and speech recognition.

Find me on:
AFCIT
http://www.afcit.xyz

YouTube
https://www.youtube.com/channel/UCuewOYbBXH5gwhfOrQOZOdw

Google Plus
https://plus.google.com/u/0/+AhmedGadIT

SlideShare
https://www.slideshare.net/AhmedGadFCIT

LinkedIn
https://www.linkedin.com/in/ahmedfgad/

ResearchGate
https://www.researchgate.net/profile/Ahmed_Gad13

Academia
https://www.academia.edu/

Google Scholar
https://scholar.google.com.eg/citations?user=r07tjocAAAAJ&hl=en

Mendeley
https://www.mendeley.com/profiles/ahmed-gad12/

ORCID
https://orcid.org/0000-0003-1978-8574

Stack Overflow
http://stackoverflow.com/users/5426539/ahmed-gad

Twitter
https://twitter.com/ahmedfgad

Facebook
https://www.facebook.com/ahmed.f.gadd

Pinterest
https://www.pinterest.com/ahmedfgad/


Artificial Neural Networks (ANNs) - XOR - Step-By-Step

  1. Artificial Neural Networks (ANNs) – XOR – Step-By-Step. Menoufia University, Faculty of Computers and Information, All Departments, Artificial Intelligence. Ahmed Fawzy Gad, ahmed.fawzy@ci.menofia.edu.eg
  2. Classification Example. XOR truth table: A=0, B=1 → 1; A=1, B=0 → 1; A=0, B=0 → 0; A=1, B=1 → 0.
  3. Neural Networks. The network has an Input layer, a Hidden layer, and an Output layer.
  4. Neural Networks. The four XOR samples plotted on the A–B plane.
  5. Neural Networks. The XOR points can't be separated linearly, so a single-layer perceptron can't work. Use a hidden layer.
  6. Neural Networks. Input – Hidden – Output.
  7. Input Layer. Two inputs: A and B.
  8. Hidden Layer. Start with two neurons.
  9. Output Layer. One neuron producing the output Yj (1/0).
  10. Weights. Weights = Wi.
  11. Weights: Input Layer – Hidden Layer. Weights W1, W2, W3, W4.
  12. Weights: Hidden Layer – Output Layer. Weights W5, W6.
  13. All Layers. Weights W1 … W6.
  14. Activation Function. Produces the output Yj (1/0).
  15. Activation Function. (network diagram)
  16. Activation Function. (network diagram)
  17. Activation Function Components.
  18. Activation Function Inputs. The activation function receives an input s.
  19. Activation Function Inputs. s = SOP(Xi, Wi).
  20. Activation Function Inputs. Xi = inputs, Wi = weights; s = SOP(Xi, Wi).
  21. Activation Function Inputs. Inputs X1, X2.
  22. Activation Function Inputs. s = SOP(Xi, Wi) = Σ(i=1..m) Xi·Wi.
  23. Activation Function Inputs. Each hidden/output-layer neuron has its own SOP.
  24. Activation Function Inputs. S1 = X1·W1 + X2·W3.
  25. Activation Function Inputs. S2 = X1·W2 + X2·W4.
  26. Activation Function Inputs. S3 = S1·W5 + S2·W6.
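To make the sum of products concrete, here is a minimal Python sketch of the three SOPs above (bias not yet included); the variable and function names mirror the slide notation but are otherwise assumptions, not the author's code.

    # Sum of products (SOP) for the two hidden neurons and the output neuron,
    # using the slide notation: S1, S2 are the hidden SOPs, S3 is the output SOP.
    def sop_hidden_and_output(x1, x2, w1, w2, w3, w4, w5, w6):
        s1 = x1 * w1 + x2 * w3      # hidden neuron 1
        s2 = x1 * w2 + x2 * w4      # hidden neuron 2
        s3 = s1 * w5 + s2 * w6      # output neuron (bias and activation come later)
        return s1, s2, s3

    # Example: inputs A=1, B=0 with all weights set to 1 (illustrative values only).
    print(sop_hidden_and_output(1, 0, 1, 1, 1, 1, 1, 1))   # (1, 1, 2)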
  27. Activation Function Outputs. F(s) maps the SOP s to a class label Yj.
  28. Activation Function Outputs. Each hidden/output-layer neuron has its own activation function.
  29. Activation Functions. Piecewise Linear, Sigmoid, Binary.
  30. Activation Functions. Which activation function to use? The outputs are class labels; there are TWO class labels, so we need TWO possible outputs: choose a function that gives two outputs.
  31. Activation Functions. Binary.
  32. Activation Function. The binary (bin) function maps s to 1/0.
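A minimal sketch of the binary (bin) activation as it is used later in the training example, returning 1 for s ≥ 0 and 0 otherwise; the function name is an assumption.

    def bin_activation(s):
        """Binary step activation: 1 if s >= 0, else 0 (as used in the training example)."""
        return 1 if s >= 0 else 0

    print(bin_activation(0.5))    # 1
    print(bin_activation(-0.5))   # 0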
  33. Bias.
  34. Bias: Hidden Layer Neurons. Each hidden neuron gets a bias (b1, b2) fed by a constant +1 input.
  35. Bias: Output Layer Neurons. The output neuron gets a bias b3 fed by a constant +1 input.
  36. All Bias Values. b1, b2, b3.
  37. Bias: Add Bias to SOP. Without bias: S1 = X1·W1 + X2·W3; S2 = X1·W2 + X2·W4; S3 = S1·W5 + S2·W6.
  38. Bias: Add Bias to SOP. S1 = (+1)·b1 + X1·W1 + X2·W3.
  39. Bias: Add Bias to SOP. S2 = (+1)·b2 + X1·W2 + X2·W4.
  40. Bias: Add Bias to SOP. S3 = (+1)·b3 + S1·W5 + S2·W6.
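As a minimal sketch (variable and function names are assumptions), the same three SOPs with the biases folded in as weights on a constant +1 input:

    def sop_with_bias(x1, x2, w1, w2, w3, w4, w5, w6, b1, b2, b3):
        # Each bias is fed by a constant +1 input, so it is simply added to its neuron's SOP.
        s1 = 1 * b1 + x1 * w1 + x2 * w3   # hidden neuron 1
        s2 = 1 * b2 + x1 * w2 + x2 * w4   # hidden neuron 2
        # Slide 40 writes S3 in terms of S1 and S2; in the training example below, the
        # values actually plugged in for S1 and S2 are the hidden outputs after activation.
        s3 = 1 * b3 + s1 * w5 + s2 * w6
        return s1, s2, s3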
  41. Bias Importance.
  42. Bias Importance. Consider the X–Y plane.
  43. Bias Importance. A line: y = ax + b.
  44. Bias Importance. y = ax + b.
  45. Bias Importance. b is the Y-intercept.
  46. Bias Importance. b = 0.
  47. Bias Importance. With b = 0 the line is forced through the origin.
  48. Bias Importance. b = +v.
  49. Bias Importance. With b = +v the line shifts up.
  50. Bias Importance. With b = -v the line shifts down.
  51. Bias Importance. b = +v.
  52. Bias Importance. The same concept applies to the bias: S = Σ(i=1..m) Xi·Wi + BIAS.
  53. Bias Importance. S = Σ(i=1..m) Xi·Wi + BIAS.
  54. Bias Importance. S = Σ(i=1..m) Xi·Wi + BIAS.
  55. Bias Importance. S = Σ(i=1..m) Xi·Wi + BIAS.
  56. Bias Importance. S = Σ(i=1..m) Xi·Wi + BIAS.
  57. Learning Rate. 0 ≤ η ≤ 1.
  58. Summary of Parameters: Inputs Xm. X(n) = (X0, X1, X2, …).
  59. Summary of Parameters: Weights Wm. W(n) = (W0, W1, W2, …).
  60. Summary of Parameters: Bias (b1, b2, b3).
  61. Summary of Parameters: Sum Of Products (SOP) s. s = X0·W0 + X1·W1 + X2·W2 + …
  62. Summary of Parameters: Activation Function. bin.
  63. Summary of Parameters: Outputs Yj.
  64. Summary of Parameters: Learning Rate η. 0 ≤ η ≤ 1.
  65. Other Parameters: Step n. n = 0, 1, 2, …
  66. Other Parameters: Desired Output dj. d(n) = 1 if x(n) belongs to class C1 (output 1), and 0 if x(n) belongs to class C2 (output 0).
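Before the worked example, the parameters above can be written down as plain values. This is only an illustrative collection (names are assumptions), with the numbers taken from the training example that follows:

    eta = 0.001                                # learning rate, 0 <= eta <= 1
    samples = [((1, 0), 1),                    # (A, B) -> desired output d, in the order used below
               ((0, 1), 1),
               ((0, 0), 0),
               ((1, 1), 0)]
    weights = {"b1": -1.5, "b2": -0.5, "b3": -0.5,
               "w1": 1, "w2": 1, "w3": 1, "w4": 1, "w5": -2, "w6": 1}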
  67. Neural Networks Training Steps: (1) Weights Initialization, (2) Inputs Application, (3) Sum of Inputs-Weights Products, (4) Activation Function Response Calculation, (5) Weights Adaptation, (6) Back to Step 2.
  68. Regarding the 5th Step (Weights Adaptation): if the predicted output Y(n) is not the same as the desired output d(n), then the weights are adapted according to W(n+1) = W(n) + η·[d(n) - Y(n)]·X(n), where W(n) = [b(n), W1(n), W2(n), W3(n), …, Wm(n)].
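A hedged sketch of the adaptation rule on slide 68, written as a small Python helper for a generic weight vector. It is an illustration of the equation, not the author's code; in the XOR walkthrough below, Y(n) = d(n) at every step, so the weights are never changed.

    def adapt_weights(w, x, d, y, eta):
        """Update from slide 68: W(n+1) = W(n) + eta * (d - Y) * X(n).
        w and x are equal-length sequences; returns the new weight list."""
        return [wi + eta * (d - y) * xi for wi, xi in zip(w, x)]

    # When the prediction already matches the desired output, the weights stay the same:
    print(adapt_weights([-1.5, 1, 1], [1, 1, 0], d=1, y=1, eta=0.001))   # [-1.5, 1.0, 1.0]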
  69. Neural Networks Training Example, Step n=0. In each step of the solution, the parameters of the neural network must be known. Parameters of step n=0: η = .001; X(n) = X(0) = (+1, +1, +1, 1, 0), where the three +1 entries are the bias inputs and A = 1, B = 0; W(n) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1); d(n) = d(0) = 1.
  70. Neural Networks Training Example, Step n=0. Network diagram with the numeric values: b1 = -1.5, b2 = -.5, b3 = -.5, W1 = W2 = W3 = W4 = +1, W5 = -2, W6 = +1.
  71. Neural Networks Training Example, Step n=0 – SOP – S1. S1 = (+1)·b1 + X1·W1 + X2·W3 = +1*(-1.5) + 1*1 + 0*1 = -.5.
  72. Neural Networks Training Example, Step n=0 – Output – S1. Y(S1) = BIN(S1) = BIN(-.5) = 0, where bin(s) = 1 if s ≥ 0, and 0 if s < 0.
  73. Neural Networks Training Example, Step n=0 – SOP – S2. S2 = (+1)·b2 + X1·W2 + X2·W4 = +1*(-.5) + 1*1 + 0*1 = .5.
  74. Neural Networks Training Example, Step n=0 – Output – S2. Y(S2) = BIN(S2) = BIN(.5) = 1.
  75. Neural Networks Training Example, Step n=0 – SOP – S3. S3 = (+1)·b3 + S1·W5 + S2·W6 = +1*(-.5) + 0*(-2) + 1*1 = .5 (the values plugged in for S1 and S2 are the hidden-neuron outputs Y(S1) = 0 and Y(S2) = 1).
  76. Neural Networks Training Example, Step n=0 – Output – S3. Y(S3) = BIN(S3) = BIN(.5) = 1.
  77. Neural Networks Training Example, Step n=0 – Output. Y(n) = Y(0) = Y(S3) = 1.
  78. Neural Networks Training Example, Step n=0 – Predicted vs. Desired. Y(n) = Y(0) = 1 and d(n) = d(0) = 1. ∵ Y(n) = d(n) ∴ the weights are correct; no adaptation.
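The forward pass of step n=0 can be reproduced with a short sketch (function and variable names are assumptions). Note that the hidden activations BIN(S1) and BIN(S2) are what feed the output SOP, which is exactly the arithmetic on slides 71–77; the same function covers steps n=1, 2, 3 with their inputs.

    def bin_act(s):
        return 1 if s >= 0 else 0

    def forward(a, b, w):
        """Forward pass of the 2-2-1 XOR network using the slide notation."""
        s1 = w["b1"] + a * w["w1"] + b * w["w3"]
        s2 = w["b2"] + a * w["w2"] + b * w["w4"]
        h1, h2 = bin_act(s1), bin_act(s2)            # hidden-layer outputs
        s3 = w["b3"] + h1 * w["w5"] + h2 * w["w6"]
        return bin_act(s3)

    w = {"b1": -1.5, "b2": -0.5, "b3": -0.5,
         "w1": 1, "w2": 1, "w3": 1, "w4": 1, "w5": -2, "w6": 1}
    print(forward(1, 0, w))   # 1, matching Y(0) = d(0) = 1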
  79. Neural Networks Training Example, Step n=1. Parameters of step n=1: η = .001; X(n) = X(1) = (+1, +1, +1, 0, 1); W(n) = W(1) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1); d(n) = d(1) = 1.
  80. Neural Networks Training Example, Step n=1. Network diagram with the same weights as step n=0.
  81. Neural Networks Training Example, Step n=1 – SOP – S1. S1 = (+1)·b1 + X1·W1 + X2·W3 = +1*(-1.5) + 0*1 + 1*1 = -.5.
  82. Neural Networks Training Example, Step n=1 – Output – S1. Y(S1) = BIN(-.5) = 0.
  83. Neural Networks Training Example, Step n=1 – SOP – S2. S2 = (+1)·b2 + X1·W2 + X2·W4 = +1*(-.5) + 0*1 + 1*1 = .5.
  84. Neural Networks Training Example, Step n=1 – Output – S2. Y(S2) = BIN(.5) = 1.
  85. Neural Networks Training Example, Step n=1 – SOP – S3. S3 = (+1)·b3 + S1·W5 + S2·W6 = +1*(-.5) + 0*(-2) + 1*1 = .5.
  86. Neural Networks Training Example, Step n=1 – Output – S3. Y(S3) = BIN(.5) = 1.
  87. Neural Networks Training Example, Step n=1 – Output. Y(n) = Y(1) = Y(S3) = 1.
  88. Neural Networks Training Example, Step n=1 – Predicted vs. Desired. Y(1) = 1 and d(1) = 1. ∵ Y(n) = d(n) ∴ the weights are correct; no adaptation.
  89. Neural Networks Training Example, Step n=2. Parameters of step n=2: η = .001; X(n) = X(2) = (+1, +1, +1, 0, 0); W(n) = W(2) = W(1) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1); d(n) = d(2) = 0.
  90. Neural Networks Training Example, Step n=2. Network diagram with the same weights.
  91. Neural Networks Training Example, Step n=2 – SOP – S1. S1 = (+1)·b1 + X1·W1 + X2·W3 = +1*(-1.5) + 0*1 + 0*1 = -1.5.
  92. Neural Networks Training Example, Step n=2 – Output – S1. Y(S1) = BIN(-1.5) = 0.
  93. Neural Networks Training Example, Step n=2 – SOP – S2. S2 = (+1)·b2 + X1·W2 + X2·W4 = +1*(-.5) + 0*1 + 0*1 = -.5.
  94. Neural Networks Training Example, Step n=2 – Output – S2. Y(S2) = BIN(-.5) = 0.
  95. Neural Networks Training Example, Step n=2 – SOP – S3. S3 = (+1)·b3 + S1·W5 + S2·W6 = +1*(-.5) + 0*(-2) + 0*1 = -.5.
  96. Neural Networks Training Example, Step n=2 – Output – S3. Y(S3) = BIN(-.5) = 0.
  97. Neural Networks Training Example, Step n=2 – Output. Y(n) = Y(2) = Y(S3) = 0.
  98. Neural Networks Training Example, Step n=2 – Predicted vs. Desired. Y(2) = 0 and d(2) = 0. ∵ Y(n) = d(n) ∴ the weights are correct; no adaptation.
  99. Neural Networks Training Example, Step n=3. Parameters of step n=3: η = .001; X(n) = X(3) = (+1, +1, +1, 1, 1); W(n) = W(3) = W(2) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1); d(n) = d(3) = 0.
  100. Neural Networks Training Example, Step n=3. Network diagram with the same weights.
  101. Neural Networks Training Example, Step n=3 – SOP – S1. S1 = (+1)·b1 + X1·W1 + X2·W3 = +1*(-1.5) + 1*1 + 1*1 = .5.
  102. Neural Networks Training Example, Step n=3 – Output – S1. Y(S1) = BIN(.5) = 1.
  103. Neural Networks Training Example, Step n=3 – SOP – S2. S2 = (+1)·b2 + X1·W2 + X2·W4 = +1*(-.5) + 1*1 + 1*1 = 1.5.
  104. Neural Networks Training Example, Step n=3 – Output – S2. Y(S2) = BIN(1.5) = 1.
  105. Neural Networks Training Example, Step n=3 – SOP – S3. S3 = (+1)·b3 + S1·W5 + S2·W6 = +1*(-.5) + 1*(-2) + 1*1 = -1.5.
  106. Neural Networks Training Example, Step n=3 – Output – S3. Y(S3) = BIN(-1.5) = 0.
  107. Neural Networks Training Example, Step n=3 – Output. Y(n) = Y(3) = Y(S3) = 0.
  108. Neural Networks Training Example, Step n=3 – Predicted vs. Desired. Y(3) = 0 and d(3) = 0. ∵ Y(n) = d(n) ∴ the weights are correct; no adaptation.
  109. Final Weights. The current weights predicted the desired outputs for all four XOR samples: b1 = -1.5, b2 = -.5, b3 = -.5, W1 = W2 = W3 = W4 = +1, W5 = -2, W6 = +1.
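As a closing check, the same forward pass can be run over all four XOR inputs with the final weights. This sketch redefines the helpers so it is self-contained; it reproduces the desired outputs from the truth table.

    def bin_act(s):
        return 1 if s >= 0 else 0

    def forward(a, b, w):
        s1 = w["b1"] + a * w["w1"] + b * w["w3"]
        s2 = w["b2"] + a * w["w2"] + b * w["w4"]
        h1, h2 = bin_act(s1), bin_act(s2)
        s3 = w["b3"] + h1 * w["w5"] + h2 * w["w6"]
        return bin_act(s3)

    final_w = {"b1": -1.5, "b2": -0.5, "b3": -0.5,
               "w1": 1, "w2": 1, "w3": 1, "w4": 1, "w5": -2, "w6": 1}

    for (a, b), d in [((1, 0), 1), ((0, 1), 1), ((0, 0), 0), ((1, 1), 0)]:
        y = forward(a, b, final_w)
        print(f"A={a}, B={b} -> predicted {y}, desired {d}")   # all four match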
