What is a Multilayer Perceptron Model?
The limitations of the single-layer perceptron model are overcome by the multilayer model. A multilayer perceptron (MLP) contains an input layer, one or more hidden layers (the extra layers that distinguish it from the single-layer model), and an output layer that produces the predictions.
The XOR problem is the reason behind the invention of the MLP (Multilayer Perceptron) model. The XOR problem cannot be solved by a single-layer perceptron model, as the following image illustrates.
As we can observe in the image above, no single straight line can separate the + and – points. This makes XOR one of the most famous problems in neural networks.
The following figure illustrates the multilayer perceptron architecture.
Minsky never claimed that the XOR problem could not be solved at all; he only showed that it cannot be solved by a single-layer perceptron model. A multilayer perceptron can solve XOR and other non-linearly-separable problems.
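A quick sketch with scikit-learn illustrates this point (the model settings here are illustrative choices, not the only ones that work): a single-layer `Perceptron` can never classify all four XOR points, while a small `MLPClassifier` with one hidden layer can carve out the XOR regions.

```python
import numpy as np
from sklearn.linear_model import Perceptron
from sklearn.neural_network import MLPClassifier

# The four points of the XOR truth table.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

# Single-layer perceptron: no straight line separates the classes,
# so it can never reach 100% accuracy on XOR.
perc = Perceptron(random_state=1).fit(X, y)
print("Perceptron accuracy:", perc.score(X, y))

# One hidden layer is enough to solve XOR.
mlp = MLPClassifier(hidden_layer_sizes=(8,), solver="lbfgs",
                    random_state=1, max_iter=5000).fit(X, y)
print("MLP accuracy:", mlp.score(X, y))
```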
The multilayer perceptron model above contains three kinds of layers: an input layer, hidden layers, and an output layer.
Input layer
- The starting layer, which introduces the input values into the network.
- No activation function or any other processing is required here.
Hidden layer
- Performs the classification of features.
- In practice, one or two hidden layers are sufficient for many problems, though deeper networks can help on harder ones.
- Enables better classification of complex inputs such as images.
- Each node in this layer incorporates both an activation function and a bias.
Output layer
- Functions like a hidden layer: an activation function and a bias govern the firing of each node.
- Its outputs are passed on to the real world.
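The flow through these layers can be sketched as a forward pass in NumPy (the layer sizes, random weights, and ReLU activation below are illustrative assumptions):

```python
import numpy as np

def relu(z):
    # Common hidden-layer activation: zero out negative values.
    return np.maximum(0, z)

# Illustrative sizes: 3 inputs, one hidden layer of 4 nodes, 1 output.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output

def forward(x):
    h = relu(x @ W1 + b1)   # hidden layer: weighted sum + bias, then activation
    return h @ W2 + b2      # output layer (identity activation, as in regression)

print(forward(np.array([1.0, 0.5, -0.2])))
```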
MLP in the scikit-learn Library
The following points need to be remembered for MLPs in the scikit-learn library.
- No activation function is applied in the output layer.
- Works well with single or multiple target values.
- It doesn't support GPU acceleration.
- The loss function is cross-entropy for classification (MLPClassifier) and mean squared error for regression (MLPRegressor).
The code for the implementation of an MLP is shown below:
SCIKIT library Code
Import the necessary libraries:

```python
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.datasets import fetch_california_housing
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score
import pandas as pd
```
Split the dataset into 80% training and 20% test data:

```python
cal_housing = fetch_california_housing()
X = pd.DataFrame(cal_housing.data, columns=cal_housing.feature_names)
y = cal_housing.target
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1, test_size=0.2)
```
Standardize the features with StandardScaler:

```python
sc_X = StandardScaler()
X_trainscaled = sc_X.fit_transform(X_train)
X_testscaled = sc_X.transform(X_test)
```
Use the MLPRegressor with ReLU as the activation function and three hidden layers of 64 neurons each, then score the predictions:

```python
reg = MLPRegressor(hidden_layer_sizes=(64, 64, 64), activation="relu",
                   random_state=1, max_iter=500).fit(X_trainscaled, y_train)
y_pred = reg.predict(X_testscaled)
print("The R2 score:", r2_score(y_test, y_pred))
```
Running the above code in Python IDLE or a Jupyter Notebook, I got an R² score of about 0.75.
Overall, the MLP is a neural network capable of solving many complex problems. In the next section I will cover a classification problem with an MLP. I hope you enjoyed the article; please share it and comment in the section below.