Before diving into neural networks, we first need to understand the building blocks that led to this beautiful discovery. The world is full of data, and processing it properly yields wonderful insights. Let us understand this statement with a toy example.
Let us consider the table below

X -> Independent variable (its value is used to describe Y's value)
Y -> Dependent variable (its value is inferred from X's value)
In the above table, we are given a set of values for both X and Y. Our task is to find the missing value so that the table is filled completely. Yes, you're right, it's 6. Have a chocolate to celebrate your victory. When we rewind and think about how we discovered the missing number, all we did was find the pattern, or relationship, between the two variables (X and Y). We don't know what these variables represent; we only know the relationship between them. Our brains are good at recognizing patterns in data. But if those two variables represented, say, the budget and profit of a company, then by finding the pattern between them we could estimate the profit (Y) for an unknown budget (X). This would help us analyze our company's profit. When we model the above situation in mathematical terms, life becomes easier.
Let's get started.
The mathematical equation representing the above table is Y = 1 * X. Why?
Because every value of Y is nothing but the corresponding value of X. This equation is linear because when we plot the X and Y values on a graph and connect the points, they form a straight line.
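We can check this relationship in a few lines of code. This is a minimal sketch; the sample X values below are illustrative, not taken from the original table:

```python
# The relationship from the table: Y = 1 * X.
# Sample X values (illustrative, not from the original table).
xs = [1, 2, 3, 4, 5, 6]
w = 1  # the weight relating X and Y

ys = [w * x for x in xs]
print(ys)  # [1, 2, 3, 4, 5, 6] — every Y equals its X
```

Since Y grows at a constant rate as X grows, the plotted points all fall on one straight line through the origin.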

The task was simple because it contained only a few points. But what if the data were large and contained complex relationships like Y = 3.5 * X or Y = 2.34 * X? We would be unable to capture the pattern without getting our hands dirty. But wait, we have our friend Mr. Neural Network, who is an expert at finding patterns but dumb at thinking!! We will dig into what a Neural Network is and how it could solve our problem.
A Neural Network is nothing but a bunch of interconnected neurons, like the ones in our brain. Our brain contains millions of neurons, forming a dense network. So what are the neurons here? The neurons here are nothing but the two variables we discussed above: the neuron X, which takes an input, and the output neuron Y, which is computed from it. The relationship between the two neurons (Y = 1 * X) is captured by the weight of the connection, which is "1". Let us understand it visually (because our brains are good at interpreting visuals).

So, the X neuron takes an input, multiplies it by w, and produces the output neuron Y. This combination of input and output neurons forms a network where the connection is established through the weights, hence the name neural network. Our brains contain millions of neurons to find complex relationships between inputs and outputs, but in our toy example the relationship is simple, so a single neuron is enough.
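The single-neuron network described above can be sketched in code. The class and method names here are illustrative, not from any particular library:

```python
# A minimal sketch of a single-neuron network: output = weight * input.
class Neuron:
    def __init__(self, weight):
        self.weight = weight  # strength of the connection between X and Y

    def forward(self, x):
        # the input X is multiplied by the weight to produce the output Y
        return self.weight * x

neuron = Neuron(weight=1)
print(neuron.forward(6))  # Y = 1 * 6 = 6
```

With the weight set to 1, this neuron reproduces exactly the Y = 1 * X relationship from our table.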
Some terminologies used in Deep Learning:
The layer where the input neuron X is present is known as the Input Layer.
The layer where the output neuron Y is present is known as the Output Layer.
If we adjust the weight in the above network, we can capture a different relationship. In our example it was easy to figure out the weight, but what if the relationship were complex? We somehow need to find the optimal weight that captures the relationship between the input neuron and the output neuron. It's like solving a puzzle or finding patterns in data. The method used to find the optimal weight is gradient descent using backpropagation. Sounds weird, right? We will cover this topic in the next blog post...
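To build intuition for what "finding the optimal weight" means, here is a naive sketch that simply tries a few candidate weights and keeps the one with the smallest error. The data points are illustrative, generated from Y = 3.5 * X. (Gradient descent, covered in the next post, finds the weight far more cleverly than this brute-force search.)

```python
# Illustrative data following Y = 3.5 * X.
xs = [1, 2, 3, 4]
ys = [3.5, 7.0, 10.5, 14.0]

best_w, best_error = None, float("inf")
for w in [1.0, 2.0, 3.0, 3.5, 4.0]:  # candidate weights to try
    # total squared error between predictions (w * x) and the actual Y values
    error = sum((w * x - y) ** 2 for x, y in zip(xs, ys))
    if error < best_error:
        best_w, best_error = w, error

print(best_w)  # 3.5 — the weight that best captures the relationship
```

The weight with zero error is exactly the one relating X and Y; when the data is noisy or the relationship is complex, an exhaustive search like this becomes impractical, which is where gradient descent comes in.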