The perceptron algorithm is one of the earliest algorithms in neural networks. It was developed by Frank Rosenblatt in 1957, who proposed the perceptron for binary classification. The main idea of the perceptron, along with its mathematical model, is discussed in the following paragraphs.
The diagram above shows how the perceptron works and the idea behind it for linearly separable data.
- We assign a weight Wi to each input Xi.
- We multiply each weight by its input Xi, and add a bias term at the end, whose input is always 1.
- If the weighted sum is larger than a threshold, we return 1; otherwise 0. The activation function serves this purpose.
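The three steps above can be sketched in a few lines of code. The weights, bias, and example inputs here are illustrative values I chose, not taken from the post:

```python
# A minimal sketch of a perceptron's forward pass.

def perceptron(x, w, b, threshold=0.0):
    """Return 1 if the weighted sum of inputs plus bias exceeds
    the threshold, otherwise 0 (a step activation)."""
    total = sum(wi * xi for wi, xi in zip(w, x)) + b  # bias input is always 1
    return 1 if total > threshold else 0

# Example: with these hand-picked weights the perceptron computes logical AND
print(perceptron([1, 1], w=[0.5, 0.5], b=-0.7))  # -> 1
print(perceptron([1, 0], w=[0.5, 0.5], b=-0.7))  # -> 0
```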
Linear separability is an important concept in neural networks: it asks whether we can separate the points using a single straight line. The following diagram illustrates this:
In the above image, a line clearly separates the blue points from the red points without any trouble. But a single line cannot solve every problem.
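As a quick sketch of what "one side of the line" means, we can test which side each point falls on. The points and the line coefficients below are made up purely for illustration:

```python
# Hypothetical blue and red point clusters, separated by the line x + y - 6 = 0.
blue = [(1, 1), (2, 1), (1, 2)]
red = [(4, 4), (5, 3), (4, 5)]

def side(point, a=1.0, b=1.0, c=-6.0):
    """The sign of a*x + b*y + c tells us which side of the line a point is on."""
    x, y = point
    return 1 if a * x + b * y + c > 0 else 0

print([side(p) for p in blue])  # all blue points on one side
print([side(p) for p in red])   # all red points on the other
```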
The perceptron learning rule works by initializing the weights randomly; it then takes one sample at a time and predicts Yi.
If the prediction is wrong, we update the weights, either increasing or decreasing them.
We repeat this procedure until there are no more errors.
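The procedure above can be sketched as a short training loop. The AND dataset, learning rate, and epoch limit are illustrative assumptions, not details from the post:

```python
import random

def train_perceptron(samples, lr=0.1, epochs=1000, seed=0):
    """samples: list of (inputs, label) pairs. Returns learned weights and bias."""
    rng = random.Random(seed)
    n = len(samples[0][0])
    w = [rng.uniform(-1, 1) for _ in range(n)]  # initialize weights randomly
    b = rng.uniform(-1, 1)
    for _ in range(epochs):
        errors = 0
        for x, y in samples:                    # one sample at a time
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            if err != 0:                        # wrong: adjust weights up or down
                w = [wi + lr * err * xi for wi, xi in zip(w, x)]
                b += lr * err
                errors += 1
        if errors == 0:                         # repeat until no error remains
            break
    return w, b

# Example: learn the logical AND function, which is linearly separable
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(data)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0 for x, _ in data]
print(preds)  # -> [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop eventually stops with zero errors.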
Some problems cannot be solved by a single perceptron; the classic example is XOR, whose points are not linearly separable. In such cases we need to adopt a multi-layer model.
In this post I have used several technical terms, such as activation function and bias. I will discuss each one and its role in the neural network.
An activation function transforms a neuron's input into its output. In a neural network, the raw weighted sums are not directly the required output;
we need to normalize the values. To perform this normalization, we use different activation functions; examples are the step function, the sigmoid function, and the sign function.
Images of the different activation functions are shown below.
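The three activation functions named above can be written directly from their standard definitions. This is a minimal sketch; the sample inputs are arbitrary:

```python
import math

def step(z):
    """Step function: 1 if the input is non-negative, otherwise 0."""
    return 1 if z >= 0 else 0

def sigmoid(z):
    """Sigmoid squashes any real input into the open range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def sign(z):
    """Sign function: +1 for positive input, -1 for negative, 0 at zero."""
    return (z > 0) - (z < 0)

print(step(0.3), round(sigmoid(0.0), 2), sign(-2.5))  # -> 1 0.5 -1
```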
The Multi-Layer Perceptron Model
Non-linearly separable problems can be solved effectively using the multi-layer perceptron model. I will discuss it in detail in future posts.