Exploring Artificial Intelligence in practice, with several network architectures implemented on the fly in a live coding session. In this talk we went through shallow, intermediate, deep, convolutional, and residual networks, showing the differences in hyperparameter tuning and which architecture to use when.
In addition, we looked at how these ideas came to life through the minds of computer scientists such as Geoffrey Hinton, Yoshua Bengio, Xavier Glorot, and Adam Coates.
The code presented in this talk can be found here: https://github.com/ekholabs/DLinK
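To give a flavor of the kind of live coding involved, below is a minimal sketch contrasting a shallow and a deeper fully connected network in Keras on MNIST. This is purely illustrative and not taken from the linked repository; the layer sizes, optimizers, and learning rates are assumptions chosen to show typical hyperparameter differences between the two.

```python
# Illustrative sketch only -- not code from the linked repository.
# Assumes TensorFlow/Keras and the built-in MNIST dataset.
import tensorflow as tf
from tensorflow.keras import layers, models

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

# A shallow network: one hidden layer, sigmoid activation, plain SGD.
shallow = models.Sequential([
    layers.Dense(64, activation="sigmoid", input_shape=(784,)),
    layers.Dense(10, activation="softmax"),
])
shallow.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1),
                loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# A deeper network: more layers, ReLU, dropout, Adam with a smaller learning rate.
deep = models.Sequential([
    layers.Dense(128, activation="relu", input_shape=(784,)),
    layers.Dropout(0.2),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.2),
    layers.Dense(10, activation="softmax"),
])
deep.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
             loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Train both briefly to compare learning curves.
shallow.fit(x_train, y_train, epochs=5, batch_size=128, validation_split=0.1)
deep.fit(x_train, y_train, epochs=5, batch_size=128, validation_split=0.1)
```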