In deep learning, the RBM is the basic building block that is stacked layer by layer to form a deep network. These slides cover the basic components of an RBM: the bipartite graph structure, Gibbs sampling, contrastive divergence (CD-1), and the energy function.
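For reference, the energy function mentioned above can be written, in the standard binary-RBM formulation (this notation is assumed here, not taken from the slides):

```latex
% Energy of a binary RBM with visible units v_i, hidden units h_j,
% visible/hidden biases a_i, b_j, and weights w_{ij}.
E(\mathbf{v},\mathbf{h}) = -\sum_i a_i v_i \;-\; \sum_j b_j h_j \;-\; \sum_{i,j} v_i w_{ij} h_j
\qquad
P(\mathbf{v},\mathbf{h}) = \frac{e^{-E(\mathbf{v},\mathbf{h})}}{Z}
```

Gibbs sampling and contrastive divergence are both ways of drawing (approximate) samples from this distribution in order to estimate the gradient of the log-likelihood.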
4. A Practical Guide to Training Restricted Boltzmann Machines
• Overview
• Training an RBM requires setting 7 meta-parameters:
– The learning rate
– The momentum
– The weight-cost
– The sparsity target
– The initial values of the weights
– The number of hidden units
– The size of each mini-batch
• But a recipe alone does not explain why these decisions were made or how minor changes will affect performance
Aug 2010, Geoffrey Hinton (University of Toronto)
A comparison of Neural Network Architectures
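To make the role of each meta-parameter concrete, here is a minimal NumPy sketch of a CD-1 update for a binary RBM, with all seven meta-parameters marked. The specific values and the toy data are illustrative assumptions, not Hinton's recommendations.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- the 7 meta-parameters (illustrative values only) ---
learning_rate   = 0.05
momentum        = 0.5
weight_cost     = 1e-4   # L2 weight-decay coefficient
sparsity_target = 0.05   # desired mean hidden-unit activation
init_std        = 0.01   # std of the initial Gaussian weights
n_hidden        = 16
batch_size      = 10

n_visible = 20
W  = rng.normal(0.0, init_std, (n_visible, n_hidden))
bv = np.zeros(n_visible)
bh = np.zeros(n_hidden)
W_inc = np.zeros_like(W)   # momentum buffer for the weight updates

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0):
    """One CD-1 update on a mini-batch v0 of binary visible vectors."""
    global W, bv, bh, W_inc
    ph0 = sigmoid(v0 @ W + bh)                    # positive phase
    h0  = (rng.random(ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + bv)                  # one Gibbs step back down
    ph1 = sigmoid(pv1 @ W + bh)                   # and up again
    # CD-1 gradient estimate: <v h>_data - <v h>_recon
    grad  = (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
    W_inc = momentum * W_inc + learning_rate * (grad - weight_cost * W)
    W  += W_inc
    bv += learning_rate * (v0 - pv1).mean(axis=0)
    # hidden-bias update, nudged toward the sparsity target
    bh += learning_rate * ((ph0 - ph1).mean(axis=0)
                           + (sparsity_target - ph0.mean(axis=0)))
    return float(((v0 - pv1) ** 2).mean())        # reconstruction error

# Toy data: two complementary binary patterns tiled into a mini-batch.
patterns = np.array([[1] * 10 + [0] * 10,
                     [0] * 10 + [1] * 10], dtype=float)
batch = np.tile(patterns, (batch_size // 2, 1))

err_first = cd1_step(batch)
for _ in range(300):
    err_last = cd1_step(batch)
```

Reconstruction error is only a rough progress signal, but on this fixed batch it should fall well below its initial value as the weights learn the two patterns.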
26. An Analysis of Single-Layer Networks in Unsupervised Feature Learning
• Effective Learning Features of a 1-Hidden-Layer RBM
– Features (# of hidden nodes)
– Receptive Fields (Filters, Field size)
– Whitening
2011, Honglak Lee
There are two things we are trying to accomplish with whitening:
1. Make the features less correlated with one another.
2. Give all of the features the same variance.
Whitening has two simple steps:
1. Project the dataset onto the eigenvectors. This rotates the dataset so that there is no correlation
between the components.
2. Normalize the dataset to have a variance of 1 for all components. This is done by simply dividing
each component by the square root of its eigenvalue.
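The two steps above can be sketched with NumPy's eigendecomposition. This is a minimal sketch: the toy correlated data and the small `eps` guard against near-zero eigenvalues are my additions, not part of the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy correlated data: the second feature is mostly a copy of the first.
x = rng.normal(size=(1000, 2))
X = np.column_stack([x[:, 0], 0.9 * x[:, 0] + 0.1 * x[:, 1]])

Xc = X - X.mean(axis=0)                  # center the data first
cov = Xc.T @ Xc / len(Xc)                # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigendecomposition (symmetric)

rotated = Xc @ eigvecs                   # step 1: project onto eigenvectors
eps = 1e-8                               # guard against tiny eigenvalues
whitened = rotated / np.sqrt(eigvals + eps)  # step 2: unit variance

# After whitening, the covariance should be (close to) the identity.
cov_w = whitened.T @ whitened / len(whitened)
```

After step 1 the components are uncorrelated (the covariance is diagonal); step 2 rescales each component so the covariance becomes the identity, which is exactly the two goals listed above.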