A Deep Belief Network (DBN) is a generative model trained with unsupervised learning. It has a deep architecture built by stacking Restricted Boltzmann Machines (RBMs) on top of one another.

It is composed of multiple layers of stochastic binary variables, also referred to as feature detectors or hidden units.

First, let us understand what Restricted Boltzmann Machines are:


Boltzmann machines are named after the Boltzmann distribution (also known as the Gibbs distribution), an important subject in statistical mechanics. They were invented by Geoffrey Hinton and Terry Sejnowski.

Restricted Boltzmann Machines are stochastic two-layer building blocks, with one visible and one hidden layer, invented by the same researchers. They are used in cases such as:

Topic Modeling

Feature Learning

Regression

Classification

Collaborative Filtering
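As a concrete illustration of the building block itself, here is a minimal sketch of a binary RBM trained with one step of contrastive divergence (CD-1), using NumPy. The class name, toy data, and hyperparameters are illustrative choices, not part of the original article:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Binary restricted Boltzmann machine: one visible and one hidden layer."""
    def __init__(self, n_visible, n_hidden):
        self.W = rng.normal(0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_update(self, v0, lr=0.1):
        """One contrastive-divergence step on a batch of visible vectors."""
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sample hidden units
        v1 = self.visible_probs(h0)                        # reconstruction
        ph1 = self.hidden_probs(v1)
        n = v0.shape[0]
        self.W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
        self.b_v += lr * (v0 - v1).mean(axis=0)
        self.b_h += lr * (ph0 - ph1).mean(axis=0)
        return np.mean((v0 - v1) ** 2)  # reconstruction error

# Toy binary data: 32 patterns of 6 bits each.
data = rng.integers(0, 2, size=(32, 6)).astype(float)
rbm = RBM(n_visible=6, n_hidden=3)
for _ in range(100):
    err = rbm.cd1_update(data)
```

The restriction in "restricted" is that there are no visible-to-visible or hidden-to-hidden connections, which is what makes the conditional probabilities above factorize into simple sigmoid expressions.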


DBNs are trained with a greedy, layer-by-layer learning algorithm: each layer is trained in turn to learn its generative weights. These weights determine how the variables in one layer relate to the variables in the layer above.
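The greedy layer-by-layer procedure can be sketched as follows: train one RBM on the data, then feed its hidden-unit activations to the next RBM as if they were data, and so on up the stack. This is a simplified sketch with CD-1 training and illustrative sizes, not a full DBN implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, epochs=50, lr=0.1):
    """Train one binary RBM with CD-1; return weights, hidden biases,
    and the hidden-unit probabilities for the training data."""
    n_visible = data.shape[1]
    W = rng.normal(0, 0.01, (n_visible, n_hidden))
    b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)
    for _ in range(epochs):
        ph0 = sigmoid(data @ W + b_h)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        v1 = sigmoid(h0 @ W.T + b_v)          # reconstruction
        ph1 = sigmoid(v1 @ W + b_h)
        n = data.shape[0]
        W += lr * (data.T @ ph0 - v1.T @ ph1) / n
        b_v += lr * (data - v1).mean(axis=0)
        b_h += lr * (ph0 - ph1).mean(axis=0)
    return W, b_h, sigmoid(data @ W + b_h)

# Greedy layer-by-layer pretraining: each RBM's hidden activations
# become the "data" for the next RBM in the stack.
data = rng.integers(0, 2, (64, 8)).astype(float)
layer_sizes = [6, 4, 2]
weights, x = [], data
for n_hidden in layer_sizes:
    W, b_h, x = train_rbm(x, n_hidden)
    weights.append((W, b_h))
```

Each RBM only ever sees the layer directly below it, which is what makes the procedure greedy: no layer's weights are revised when later layers are trained.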

To generate a sample, several steps of Gibbs sampling are performed on the top two hidden layers, which form an RBM. A single pass of ancestral (top-down) sampling through the rest of the model then produces values for the visible units.
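The sampling procedure can be sketched in NumPy. The weights here are random and untrained, purely to show the two phases: Gibbs sampling between the top two layers, then one ancestral pass down to the visible units. Layer sizes and variable names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def sample(p):
    """Draw binary samples from a vector of Bernoulli probabilities."""
    return (rng.random(p.shape) < p).astype(float)

# Illustrative, untrained weights for a small DBN:
# 8 visible units, then hidden layers of 6 and 4 units.
W1 = rng.normal(0, 0.1, (8, 6)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.1, (6, 4)); b2 = np.zeros(6); c2 = np.zeros(4)

# 1) Gibbs sampling between the top two layers, which form an RBM:
#    alternate between the two layers until the chain mixes.
h2 = rng.integers(0, 2, 4).astype(float)
for _ in range(100):
    h1 = sample(sigmoid(h2 @ W2.T + b2))  # top RBM's "visible" side
    h2 = sample(sigmoid(h1 @ W2 + c2))    # top RBM's hidden side

# 2) One ancestral (top-down) pass through the remaining directed
#    layer turns the top-level sample into a visible-layer sample.
v = sample(sigmoid(h1 @ W1.T + b1))
```

With trained weights, the vector `v` would be a sample from the model's learned distribution over the visible units, such as a generated image.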


They are used for:

Image Generation

Image Classification

Motion Capture
