
Teachings of Ada Lovelace

Following the history of computing, learning from Ada Lovelace's monumental paper.

Published in: Data & Analytics


  1. Copyright of every illustration belongs to Sydney Padua. Hooray for her great book, The Thrilling Adventures of Lovelace and Babbage. Her book gave me the insight to organize all these contents, so hooray for that too.
  3. Augusta Ada King, Countess of Lovelace 1815~1852
  6. George Gordon Byron 1788~1824
  7. “Byronmania”
  8. The Bishop of Old Patras Germanos Blesses the Flag of Revolution by Theodoros Vryzakis (1814~1878)
  10. Anne Isabella Noel Byron 1792~1860
  14. “The princess of parallelograms.”
  17. Augusta Ada King, Countess of Lovelace (when she was 4)
  23. Mary Somerville 1780~1872
  25. “The Queen of Nineteenth-Century Science.”
  26. Mary Somerville 1780~1872
  30. Caroline Herschel 1750~1848
  31. John Stuart Mill 1806~1873
  33. Mary Somerville 1780~1872
  34. Augustus De Morgan 1806~1871
  35. Lovelace’s sketch with De Morgan’s comments
  36. “The potential to be an original mathematical investigator,”
  38. George Boole 1815~1864
  39. Charles Babbage 1791~1871
  44. Michael Faraday 1791~1867
  45. Charles Darwin 1809~1882
  46. Alfred, Lord Tennyson 1809~1892
  48. Charles Dickens 1812~1870
  49. Florence Nightingale 1820~1910
  51. Difference Engine
  53. Newton's method of divided differences
  56. “The degree decreases when taking divided differences”

      n | (x+1)ⁿ − xⁿ
      1 | (x+1) − x = 1
      2 | (x+1)² − x² = 2x + 1
      3 | (x+1)³ − x³ = 3x² + 3x + 1
      ⋮ | ⋮
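The principle behind the table above can be sketched in code. This is an illustrative example, not from the slides: for a polynomial of degree n, the n-th forward differences are constant, which is why the Difference Engine can tabulate polynomials by addition alone.

```python
# Illustrative sketch (not from the slides) of the finite-difference
# principle behind the Difference Engine: for a degree-n polynomial,
# the n-th forward differences are constant.

def difference_table(values, depth):
    """Build successive rows of forward differences."""
    rows = [values]
    for _ in range(depth):
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    return rows

# f(x) = x^3 sampled at x = 0..6
rows = difference_table([x**3 for x in range(7)], 3)
print(rows[3])  # prints [6, 6, 6, 6]: the third differences of x^3 are all 3! = 6
```

Running the process in reverse, the engine rebuilds each new table value from the constant bottom row using only additions.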
  61. Difference Engine
  64. Plan Diagram of the Analytical Engine
  68. Luigi Federico Menabrea 1809~1896
  70. Sir Charles Wheatstone 1802~1875
  73. “Whenever any result is sought by its aid, the question will then arise — By what course of calculation can these results be arrived at by the machine in the shortest time?”
  77. Every function can be calculated when continued ad infinitum. An operation is any process which alters the mutual relation of two or more things. Operations are homogeneous, but distributed amongst different subjects of operation. By what course of calculation can those results be arrived at in the shortest time?
  82. Augustus De Morgan 1806~1871
  85. De Morgan’s Laws
      Set theory: (A ∪ B)′ = A′ ∩ B′ and (A ∩ B)′ = A′ ∪ B′
      Formal language: ¬(P ∨ Q) = ¬P ∧ ¬Q and ¬(P ∧ Q) = ¬P ∨ ¬Q
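The two laws above can be checked mechanically; this illustrative snippet (not from the slides) verifies them over every boolean assignment of P and Q.

```python
# Illustrative check (not from the slides): De Morgan's laws verified
# exhaustively over every boolean assignment of P and Q.
from itertools import product

for p, q in product([False, True], repeat=2):
    assert (not (p or q)) == ((not p) and (not q))  # ¬(P ∨ Q) = ¬P ∧ ¬Q
    assert (not (p and q)) == ((not p) or (not q))  # ¬(P ∧ Q) = ¬P ∨ ¬Q
print("De Morgan's laws hold for all boolean inputs")
```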
  87. George Boole 1815~1864
  88. Boole's House and School, 3 Pottergate in Lincoln
  91. George Boole 1815~1864
  92. Gottfried Wilhelm Leibniz 1646~1716
  93. Sir William Rowan Hamilton 1805~1865
  97. Joseph Hill’s letter (1851), recalling the meeting of Boole and Babbage: “As Boole had discovered that means of reasoning might be conducted by a mathematical process, and Babbage had invented a machine for the performance of mathematical work, the two great men together seemed to have taken steps towards the construction of that great prodigy, a Thinking Machine.”
  102. Claude Shannon 1916~2001
  104. “… it just happened that no one else was familiar with both fields at the same time.”
  106. Switch representation of the AND, OR, and NOT functions
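A hypothetical model (not from the slides, and purely illustrative) of Shannon's insight: relay contacts compute boolean algebra, with series contacts realizing AND, parallel contacts realizing OR, and a normally-closed contact realizing NOT.

```python
# Hypothetical model (not from the slides): switches as boolean gates.

def series(a, b):        # current flows only if both contacts are closed
    return a and b       # AND

def parallel(a, b):      # current flows if either contact is closed
    return a or b        # OR

def normally_closed(a):  # contact opens when the relay is energized
    return not a         # NOT

# XOR wired purely from the three primitives above
def xor(a, b):
    return parallel(series(a, normally_closed(b)),
                    series(normally_closed(a), b))

truth_table = [xor(a, b) for a in (False, True) for b in (False, True)]
print(truth_table)  # prints [False, True, True, False]
```

The same composition of contacts is how every combinational circuit in a modern processor is still built, only with transistors in place of relays.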
  109. Howard Hathaway Aiken 1900~1973
  110. Harvard Mark I
  113. “… felt like Babbage was addressing me personally from the past.”
  115. Harvard Architecture
  117. Grace Hopper 1906~1992
  119. First computer bug, found while working on the Mark II (1947)
  122. John von Neumann 1903~1957
  125. Von Neumann Architecture vs. Harvard Architecture
      Von Neumann: one memory simplifies the design, but the single bus acts as a bottleneck; one unified cache; freedom to organize memory, though this can be error-prone; hardware-accelerated parallel execution is impossible and can only be simulated in software.
      Harvard: two buses are expensive, and the control unit is tricky to develop; two separate caches; free data memory cannot be repurposed; parallel access to data and instructions.
  127. Boolean algebra gives the operations of reasoning in the symbolical language of a calculus. Propositions are variables. Yes/No in Boolean logic becomes 1/0 in circuitry. Variables are just signals until we choose to look at them as variables.
  134. Claude Shannon 1916~2001
  135. “The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.”
  136. N bits can represent 2ᴺ numbers; equivalently, C distinct values can be represented with log₂ C bits.
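The counting argument above can be made concrete; this minimal sketch (not from the slides) shows the two directions of the equivalence.

```python
# Minimal sketch (not from the slides): N bits index 2**N values,
# so C distinct values need ceil(log2(C)) bits.
import math

def bits_needed(c):
    return math.ceil(math.log2(c))

print(2 ** 8)             # prints 256: 8 bits cover 256 numbers
print(bits_needed(256))   # prints 8
print(bits_needed(1000))  # prints 10, since 2**10 = 1024 >= 1000
```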
  138. Shannon entropy for a biased coin
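The biased-coin slides can be summarized in a few lines; this minimal sketch (not from the slides) computes the binary entropy function, which peaks at the fair coin.

```python
# Minimal sketch (not from the slides): Shannon entropy of a biased coin,
# H(p) = -p*log2(p) - (1-p)*log2(1-p), maximized at p = 0.5.
import math

def binary_entropy(p):
    if p in (0.0, 1.0):  # a certain outcome carries no information
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # prints 1.0: a fair coin yields one full bit
print(binary_entropy(0.9) < binary_entropy(0.5))  # prints True: bias lowers entropy
```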
  148. “Entropy is the average rate at which information is produced by a stochastic source of data.”
  150. “Lower uniform probability means more bits needed, so more information.”
  154. John von Neumann 1903~1957
  155. “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.”
  159. The Second Law of Thermodynamics: the total entropy of an isolated system can never decrease over time.
  163. Ludwig Boltzmann 1844~1906
  165. Entropy of Gases
  179. Cross Entropy
      H(y, ŷ) = Σᵢ yᵢ log(1/ŷᵢ) = −Σᵢ yᵢ log ŷᵢ
      Encoding the i-th symbol with log(1/ŷᵢ) bits under the wrong distribution ŷ; the result is therefore always at least as large as the entropy.
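The formula above is easy to evaluate directly; this minimal sketch (not from the slides) shows that cross entropy equals the entropy when ŷ = y and exceeds it otherwise.

```python
# Minimal sketch (not from the slides): cross entropy H(y, y_hat) equals
# the entropy H(y) when y_hat == y, and exceeds it otherwise.
import math

def cross_entropy(y, y_hat):
    return -sum(p * math.log2(q) for p, q in zip(y, y_hat) if p > 0)

y = [0.5, 0.25, 0.25]
h = cross_entropy(y, y)  # entropy of y itself
print(h)                                      # prints 1.5 (bits)
print(cross_entropy(y, [1/3, 1/3, 1/3]) > h)  # prints True: the wrong code costs extra bits
```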
  191. KL Divergence
      KL(y ‖ ŷ) = H(y, ŷ) − H(y, y) = Σᵢ yᵢ log(yᵢ/ŷᵢ)
      The number of extra bits needed when encoding with the wrong distribution. Minimizing the KL divergence is the same as minimizing the cross entropy.
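The identity above, KL as cross entropy minus entropy, can be checked numerically; this minimal sketch (not from the slides) does so for one example distribution.

```python
# Minimal sketch (not from the slides): KL(y || y_hat) equals the cross
# entropy minus the entropy, i.e. the extra bits paid for encoding with
# the wrong distribution, and is zero when the model is exact.
import math

def kl_divergence(y, y_hat):
    return sum(p * math.log2(p / q) for p, q in zip(y, y_hat) if p > 0)

y, y_hat = [0.5, 0.25, 0.25], [1/3, 1/3, 1/3]
cross = -sum(p * math.log2(q) for p, q in zip(y, y_hat) if p > 0)
entropy = -sum(p * math.log2(p) for p in y if p > 0)
print(abs(kl_divergence(y, y_hat) - (cross - entropy)) < 1e-12)  # prints True
print(kl_divergence(y, y))  # prints 0.0: no penalty for the exact model
```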
  196. “Entropy is the average rate at which information is produced by a stochastic source of data.”
  199. Everything is information.
  201. Entropy is the average rate at which information is produced by a stochastic source of data. Basically, everything is information. We are the ones who operate on it.
  207. Gödel's incompleteness theorems and Systems of Logic Based on Ordinals. The Turing machine and the halting problem. The Church–Turing thesis and the lambda calculus. “Lady Lovelace’s Objection” and the Turing test.
  216. Will you give me poetical philosophy, poetical science?
