2. OCR: picture of text -> text

And Bruno, conqueror of Carthage, strode up to me and said: "Devil take you, Edith!" "Finally, you scoundrel - are you going to confess your love for me?" I retorted. The German warrior stood stoically. He surveyed the landscape before him; grinned; spoke: "You are much better with an axe than Jane - I grant you that." (Killing, I admit, was my favourite pastime. Long before I enlisted in the Order of the Knights of Malta, I liked playing with knives. No-one objected.) "Overall, how would you rank/rate my performance in axing?" "Performance evaluations are meaningless!" (Quite true.) Radically changing the topic, I asked: "So, what are your thoughts on Empress Teresa?" "Unshareable; irrelevant; bitter." "Secret? Very unsurprising. We warrior/troubadours are quite reserved - nay - silent." (Xenophobia played a role too. You knew that. So did my friend, Zoe.)
39. Do this 100,000 times:
What's this letter? Is it an A? No, it's a Q.
What's this letter? Is it a B? No, it's a P.
What's this letter? Is it a W? Yes, it's a W.
etc.
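The drill above is supervised training in miniature: show an image, ask "what's this letter?", and correct wrong answers by nudging weights. A minimal sketch with tiny made-up 3x3 "letters" and a single-layer perceptron (the letters, sizes, iteration count, and learning rate are all my illustrative assumptions, not the talk's setup):

```python
import random

random.seed(0)

# Toy "letters": 3x3 binary images, flattened to 9 pixels.
LETTERS = {
    "T": [1, 1, 1,  0, 1, 0,  0, 1, 0],
    "L": [1, 0, 0,  1, 0, 0,  1, 1, 1],
    "X": [1, 0, 1,  0, 1, 0,  1, 0, 1],
}
NAMES = list(LETTERS)

# One weight vector per letter (a single-layer network).
weights = {n: [0.0] * 9 for n in NAMES}

def scores(pixels):
    return {n: sum(w * p for w, p in zip(weights[n], pixels)) for n in NAMES}

def guess(pixels):
    s = scores(pixels)
    return max(s, key=s.get)

# "Do this many times": show a letter, ask "what's this?", correct mistakes.
for _ in range(1000):
    name = random.choice(NAMES)
    pixels = LETTERS[name]
    answer = guess(pixels)
    if answer != name:                       # "No, it's a T" -> adjust weights
        for i, p in enumerate(pixels):
            weights[name][i] += 0.1 * p      # strengthen the right letter
            weights[answer][i] -= 0.1 * p    # weaken the wrong guess

print([guess(LETTERS[n]) for n in NAMES])
```

After enough corrections the network answers "yes" for every letter it is shown.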
41. At least one hidden layer between the input and the output, of similar size to the input -> about 1 million connections -> lots of CPU time
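The "1 million connections" figure is just multiplication: a fully connected hidden layer of about the input's size squares the pixel count. Assuming, for illustration, a 32x32 input image (the slide does not give exact dimensions):

```python
# Fully connected layer: every input pixel connects to every hidden neuron.
input_pixels = 32 * 32           # assumed image size, 1024 pixels
hidden_neurons = input_pixels    # hidden layer "of similar size"
connections = input_pixels * hidden_neurons
print(connections)               # 1048576 -- about a million weights to train
```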
42. All neurons in the hidden layer(s) are given the same inputs and targets, so they gravitate towards behaving the same
44. Gives pretty much the same response to all inputs: “Meh, it could be any letter really”
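The failure described in the last two points is the classic symmetry trap, which can be sketched as follows (linear activations and a constant error signal are simplifications of mine, chosen to keep the arithmetic visible):

```python
import random

# The symmetry trap: two hidden neurons that START with identical weights
# see the same inputs and the same error signal, so every gradient step
# changes them identically -- they remain clones and cannot specialise.
def step(neurons, x, error, lr=0.1):
    # One (linear-activation) gradient step on each neuron's input weights.
    return [[w + lr * error * xi for w, xi in zip(n, x)] for n in neurons]

x = [1.0, 0.0, 1.0]

same = [[0.5, 0.5, 0.5], [0.5, 0.5, 0.5]]    # symmetric start
for _ in range(5):
    same = step(same, x, error=1.0)
print(same[0] == same[1])    # True: still identical after training

# Random initial weights break the tie, so the neurons CAN diverge.
random.seed(1)
diff = [[random.random() for _ in range(3)] for _ in range(2)]
print(diff[0] == diff[1])    # False
```

A layer of identical neurons is effectively one neuron, which is why the whole network collapses into the bland "it could be any letter really" response.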
49. Causes the network to notice local features (e.g. horizontal lines) consistently. [figure: the same shared weight value, -0.133, repeated across positions]
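The repeated identical weight value hints at weight sharing: one small detector applied at every position, so the same local feature is noticed consistently wherever it appears. A sketch of such a shared detector as a 1-D convolution (the kernel values and the example row are mine, not the slide's):

```python
# A single 3-pixel "horizontal line" detector whose weights are SHARED
# across every position in an image row -- the same feature is detected
# the same way, wherever it occurs.
kernel = [1.0, 1.0, 1.0]   # responds to three lit pixels in a row

def detect(row):
    # Slide the shared kernel across the row (a 1-D convolution).
    return [sum(k * row[i + j] for j, k in enumerate(kernel))
            for i in range(len(row) - len(kernel) + 1)]

row = [0, 0, 1, 1, 1, 0, 0, 1, 1, 1, 0]
print(detect(row))   # the response peaks (3.0) at both places the line appears
```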
51. Train the network to output a different pattern for each letter, instead of just exciting one neuron per letter. NNs learn better if each output neuron is trained to be active roughly half of the time.
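One concrete way to get such patterns is a distributed binary code: with 5 output neurons there are 32 patterns, enough for 26 letters, and each neuron is on for roughly half of them. A sketch (plain binary counting codes are my assumed choice; other code assignments would balance the on-rates even better):

```python
import string

# 5 output neurons suffice for 26 letters (2**5 = 32 patterns).
# Each letter gets a distinct 5-bit target pattern instead of a one-hot.
codes = {letter: [(i >> b) & 1 for b in range(5)]
         for i, letter in enumerate(string.ascii_uppercase)}

print(codes["A"], codes["Z"])

# How often is each output neuron "on" across the alphabet?
on_counts = [sum(codes[l][b] for l in codes) for b in range(5)]
print(on_counts)   # each neuron is active for roughly half of the 26 letters
```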