Exploring Artificial Intelligence in practice, with several network architectures implemented on the fly in a live coding session. In this talk we went through Shallow, Intermediate, Deep, Convolutional and Residual networks, showing the differences in hyperparameter tuning, and which to use and when.
In addition, we also covered how they came to life through the minds of computer scientists like Geoffrey Hinton, Yoshua Bengio, Xavier Glorot and Adam Coates.
The code presented in this talk can be found here: https://github.com/ekholabs/DLinK
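As a taste of the live-coding session, the simplest of the architectures above, a shallow network with a single hidden layer, can be sketched in a few lines of NumPy. The layer sizes and activations here are illustrative assumptions, not taken from the DLinK repository:

```python
import numpy as np

rng = np.random.default_rng(42)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    # Subtract the row max for numerical stability.
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# A shallow network: one hidden layer between input and output.
# Sizes are arbitrary: 784 inputs (a flattened 28x28 image),
# 64 hidden units, 10 output classes.
W1 = rng.normal(0, 0.1, (784, 64))
b1 = np.zeros(64)
W2 = rng.normal(0, 0.1, (64, 10))
b2 = np.zeros(10)

def forward(x):
    h = relu(x @ W1 + b1)        # hidden layer
    return softmax(h @ W2 + b2)  # class probabilities

batch = rng.normal(0, 1, (5, 784))  # 5 fake input samples
probs = forward(batch)
print(probs.shape)  # (5, 10), each row summing to 1
```

Going from shallow to intermediate or deep is, structurally, just stacking more hidden layers; the tuning of initialisation, learning rate and regularisation is where the architectures start to differ in practice.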
2. Who am I?
Software & Machine Learning Engineer;
City.AI Ambassador;
IBM Watson AI XPRIZE contestant;
Kaggler;
Guest attendee at the AI for Good Global Summit at the UN;
X-Men geek;
family man and father of 5 (3 kids and 2 cats).
@wilderrodrigues
16. Capsule Networks
There is no pose (translational and rotational) relationship between simpler features.
Successive convolutional or max pooling layers are used to reduce spatial size.
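The spatial-size reduction mentioned on this slide can be seen in a small NumPy sketch of non-overlapping 2x2 max pooling (a hand-rolled illustration, not the implementation from the talk). Each window keeps only its maximum, discarding exactly the positional detail that capsule networks try to preserve:

```python
import numpy as np

def max_pool_2x2(x):
    """Non-overlapping 2x2 max pooling on a (H, W) feature map."""
    h, w = x.shape
    assert h % 2 == 0 and w % 2 == 0, "expects even dimensions"
    # Split into 2x2 blocks, then take the max of each block.
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

fmap = np.arange(16.0).reshape(4, 4)  # a toy 4x4 feature map
pooled = max_pool_2x2(fmap)
print(pooled.shape)  # (2, 2): spatial size halved in each dimension
```

After pooling, only one value per 2x2 window survives; where inside the window it came from is lost, which is the "no pose relationship" limitation that motivates capsules.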
18. Resources and References
https://github.com/ekholabs/DLinK
Machine Learning: Andrew Ng, Stanford University, Coursera.
Neural Networks for Machine Learning: Geoffrey Hinton, University of Toronto, Coursera.
Computational Neuroscience: Rajesh Rao & Adrienne Fairhall, University of Washington, Coursera.
Neural Networks and Deep Learning: Andrew Ng, DeepLearning.ai, Coursera.
Structuring Machine Learning Projects: Andrew Ng, DeepLearning.ai, Coursera.
Improving Deep Neural Networks: Hyperparameter Tuning, Regularisation and Optimisation: Andrew Ng, DeepLearning.ai, Coursera.
Convolutional Neural Networks: Andrew Ng, DeepLearning.ai, Coursera.
Calculus I: Jim Fowler, Ohio State University, Coursera.