Following the history of computing, learning from Ada Lovelace's monumental paper.

Published in: Data & Analytics


- 1. Copyright of every illustration belongs to Sydney Padua. Hooray for her great book, The Thrilling Adventures of Lovelace and Babbage. Her book gave me the insight to organize all these contents, so hooray on that too.
- 3. Augusta Ada King, Countess of Lovelace 1815~1852
- 6. George Gordon Byron 1788~1824
- 7. “Byronmania”
- 8. The Bishop of Old Patras Germanos Blesses the Flag of Revolution by Theodoros Vryzakis (1814~1878)
- 10. Anne Isabella Noel Byron 1792~1860
- 14. “The princess of parallelograms.”
- 17. Augusta Ada King, Countess of Lovelace (When she was 4)
- 23. Mary Somerville 1780~1872
- 25. “The Queen of Nineteenth-Century Science.”
- 26. Mary Somerville 1780~1872
- 30. Caroline Herschel 1750~1848
- 31. John Stuart Mill 1806~1873
- 33. Mary Somerville 1780~1872
- 34. Augustus De Morgan 1806~1871
- 35. Lovelace’s sketch with De Morgan’s comments
- 36. “The potential to be an original mathematical investigator,”
- 38. George Boole 1815~1864
- 39. Charles Babbage 1791~1871
- 44. Michael Faraday 1791~1867
- 45. Charles Darwin 1809~1882
- 46. Alfred, Lord Tennyson 1809~1892
- 48. Charles Dickens 1812~1870
- 49. Florence Nightingale 1820~1910
- 51. Difference Engine
- 53. Newton's method of divided differences
- 56. “The degree decreases when taking successive differences”: for n = 1, (x + 1) − x = 1; for n = 2, (x + 1)² − x² = 2x + 1; for n = 3, (x + 1)³ − x³ = 3x² + 3x + 1; and so on.
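The pattern above can be checked numerically: taking first differences of xⁿ lowers the degree by one each time, until the n-th differences are constant. This is the property the Difference Engine exploits to tabulate polynomials using only addition. A minimal Python sketch (the function name `differences` is illustrative):

```python
def differences(values):
    """First differences of a sequence: values[i+1] - values[i]."""
    return [b - a for a, b in zip(values, values[1:])]

# Tabulate x^3 for x = 0..9, then difference repeatedly.
cubes = [x**3 for x in range(10)]
d1 = differences(cubes)  # values of 3x^2 + 3x + 1 (degree 2)
d2 = differences(d1)     # values of 6x + 6 (degree 1)
d3 = differences(d2)     # constant

print(d3)  # [6, 6, 6, 6, 6, 6, 6]
```

Running the table backwards, the engine only ever adds the constant difference column upward to produce the next value of the polynomial.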
- 61. Difference Engine
- 64. Plan Diagram of the Analytical Engine
- 68. Luigi Federico Menabrea 1809~1896
- 70. Sir Charles Wheatstone 1802~1875
- 73. “Whenever any result is sought by its aid, the question will then arise — By what course of calculation can these results be arrived at by the machine in the shortest time?”
- 77. Every function can be calculated when continued ad infinitum. An operation is any process which alters the mutual relation of two or more things. Operations are homogeneous, but distributed amongst different subjects of operation. By what course of calculation can those results be arrived at in the shortest time?
- 82. Augustus De Morgan 1806~1871
- 85. De Morgan’s Laws. Set theory: (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ and (A ∩ B)ᶜ = Aᶜ ∪ Bᶜ. Formal logic: ¬(P ∨ Q) = ¬P ∧ ¬Q and ¬(P ∧ Q) = ¬P ∨ ¬Q.
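Since the logical laws quantify over only two propositions, they can be verified exhaustively over all truth assignments; a minimal Python sketch:

```python
from itertools import product

# Check both laws for every truth assignment of P and Q.
for p, q in product([False, True], repeat=2):
    assert (not (p or q)) == ((not p) and (not q))  # ¬(P ∨ Q) = ¬P ∧ ¬Q
    assert (not (p and q)) == ((not p) or (not q))  # ¬(P ∧ Q) = ¬P ∨ ¬Q
print("De Morgan's laws hold for all four assignments")
```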
- 87. George Boole 1815~1864
- 88. Boole's House and School, 3 Pottergate, Lincoln
- 91. George Boole 1815~1864
- 92. Gottfried Wilhelm Leibniz 1646~1716
- 93. Sir William Rowan Hamilton 1805~1865
- 97. Joseph Hill’s letter (1851), recalling the meeting of Boole and Babbage: “As Boole had discovered that means of reasoning might be conducted by a mathematical process, and Babbage had invented a machine for the performance of mathematical work, the two great men together seemed to have taken steps towards the construction of that great prodigy, a Thinking Machine.”
- 102. Claude Shannon 1916~2001
- 104. “… it just happened that no one else was familiar with both fields at the same time.”
- 106. Switch representation of AND, OR and NOT function
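Shannon's mapping from switching circuits to logic can be sketched in a few lines, using the convention that a closed switch is True: switches in series pass current only if both are closed (AND), switches in parallel pass current if either is closed (OR), and a normally-closed contact inverts its control signal (NOT). The function names here are illustrative:

```python
def series(a, b):
    """Two switches in series: current flows only if both are closed (AND)."""
    return a and b

def parallel(a, b):
    """Two switches in parallel: current flows if either is closed (OR)."""
    return a or b

def invert(a):
    """A normally-closed contact: opens when energized (NOT)."""
    return not a

# Any Boolean function follows by composition, e.g. XOR:
def xor(a, b):
    return parallel(series(a, invert(b)), series(invert(a), b))

for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), "->", int(xor(a, b)))
```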
- 109. Howard Hathaway Aiken 1900~1973
- 110. Harvard Mark I
- 113. “… felt like Babbage was addressing me personally from the past.”
- 115. Harvard Architecture
- 117. Grace Hopper 1906~1992
- 119. First computer bug while working on Mark II (1947)
- 122. John von Neumann 1903~1957
- 125. Von Neumann Architecture vs. Harvard Architecture. Von Neumann: a single memory simplifies the design, but the single bus becomes a bottleneck; one unified cache; freedom to organize memory, though this can be error prone; parallel access to data and instructions is impossible in hardware and can only be simulated in software. Harvard: two buses are more expensive and the control unit is trickier to develop; two separate caches; unused data memory cannot be reused for instructions; parallel access to data and instructions.
- 127. Boolean algebra gives the operations of reasoning in the symbolic language of a calculus. Propositions are variables. Yes/No in Boolean logic becomes 1/0 in circuitry. Signals are just signals until we choose to regard them as variables.
- 134. Claude Shannon 1916~2001
- 135. “The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.”
- 136. N bits can represent 2ᴺ numbers; equivalently, C distinct values can be represented by log₂ C bits.
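A small sketch of this counting argument (the helper `bits_needed` is illustrative; rounding up gives the whole number of bits required when C is not a power of two):

```python
import math

def bits_needed(c):
    """Smallest whole number of bits that can index c distinct values."""
    return math.ceil(math.log2(c))

print(bits_needed(2))     # 1
print(bits_needed(256))   # 8
print(bits_needed(1000))  # 10 (2^9 = 512 is too small, 2^10 = 1024 suffices)
```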
- 138. Shannon entropy for a biased coin
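The biased-coin entropy comes directly from Shannon's formula H(p) = −p log₂ p − (1 − p) log₂(1 − p); a minimal sketch:

```python
import math

def binary_entropy(p):
    """Shannon entropy in bits of a coin with P(heads) = p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(binary_entropy(0.5))  # 1.0: a fair coin is maximally uncertain
print(binary_entropy(1.0))  # 0.0: a two-headed coin tells us nothing
print(round(binary_entropy(0.9), 3))  # 0.469: a biased coin sits in between
```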
- 148. “Entropy is the average rate at which information is produced by a stochastic source of data.”
- 150. “A lower uniform probability means more bits are needed, and therefore more information.”
- 154. John von Neumann 1903~1957
- 155. “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.”
- 159. The 2nd Law of Thermodynamics Total entropy of an isolated system can never decrease over time.
- 163. Ludwig Boltzmann 1844~1906
- 165. Entropy of Gases
- 179. Cross Entropy: H(y, ŷ) = Σᵢ yᵢ log(1/ŷᵢ) = −Σᵢ yᵢ log ŷᵢ. The i-th symbol is encoded, under the wrong distribution, with log(1/ŷᵢ) bits; the cross entropy is therefore always at least the entropy.
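A minimal numeric sketch of this claim, comparing the cross entropy under a mismatched model with the entropy of the true distribution (the distributions are chosen for illustration):

```python
import math

def entropy(y):
    """Average bits per symbol with an optimal code for y."""
    return -sum(p * math.log2(p) for p in y if p > 0)

def cross_entropy(y, y_hat):
    """Average bits per symbol when the source is y but the
    codeword lengths log2(1/y_hat_i) are sized for y_hat."""
    return -sum(p * math.log2(q) for p, q in zip(y, y_hat))

y     = [0.5, 0.25, 0.25]  # true distribution
y_hat = [0.25, 0.5, 0.25]  # mismatched model

print(entropy(y))               # 1.5
print(cross_entropy(y, y_hat))  # 1.75: the mismatch costs extra bits
print(cross_entropy(y, y))      # 1.5: equals the entropy for the exact model
```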
- 191. KL Divergence: KL(y ‖ ŷ) = H(y, ŷ) − H(y, y) = Σᵢ yᵢ log(yᵢ/ŷᵢ). It is the number of extra bits needed when encoding with the wrong distribution; since H(y, y) is fixed, minimizing the KL divergence is the same as minimizing the cross entropy.
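A minimal numeric sketch (distributions chosen for illustration): the KL divergence is the gap between the cross entropy and the entropy, and vanishes for the exact model.

```python
import math

def kl_divergence(y, y_hat):
    """Extra bits per symbol paid for coding with y_hat instead of y."""
    return sum(p * math.log2(p / q) for p, q in zip(y, y_hat) if p > 0)

y     = [0.5, 0.25, 0.25]  # true distribution
y_hat = [0.25, 0.5, 0.25]  # mismatched model

print(kl_divergence(y, y_hat))  # 0.25 extra bits per symbol
print(kl_divergence(y, y))      # 0.0: the exact model costs nothing extra
```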
- 196. “Entropy is the average rate at which information is produced by a stochastic source of data.”
- 199. Everything is information.
- 201. Entropy is the average rate at which information is produced by a stochastic source of data. Basically, everything is information; we are the ones operating on it.
- 207. Gödel's incompleteness theorems and Systems of Logic Based on Ordinals. The Turing machine and the halting problem. The Church–Turing thesis and the lambda calculus. “Lady Lovelace’s Objection” and the Turing test.
- 216. Will you give me poetical philosophy, poetical science?
