Elements of the DNN model

Before running the model, we first have to define the elements we will use to build the multilayer perceptron. The following are the elements used in this model:

import numpy as np

# Sample inputs from -5 to 5 in steps of 0.1
vector = np.arange(-5, 5, 0.1)

def relu(x):
    # Return 0 for negative inputs, the input itself otherwise
    return max(0., x)

# Vectorize so relu can be applied element-wise to the input array
relu = np.vectorize(relu)

If the input is negative, the function outputs 0; if the input is positive, the function outputs the input unchanged. Mathematically, ReLU(x) = max(0, x). The following screenshot shows the graphical representation of the ReLU activation function generated with the preceding lines of code:

It returns the maximum of 0 and the input. This activation function will be used in every neuron of the hidden layers.
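To confirm this behavior, we can apply the vectorized function to the sample vector and to a few individual values; this is a minimal sketch using only the code already defined in this section:

```python
import numpy as np

# Sample inputs from -5 to 5 in steps of 0.1
vector = np.arange(-5, 5, 0.1)

def relu(x):
    # Return 0 for negative inputs, the input itself otherwise
    return max(0., x)

# Vectorize so relu can be applied element-wise to the input array
relu = np.vectorize(relu)

activations = relu(vector)

# Negative inputs are clamped to 0, non-negative inputs pass through
print(activations.min())
print(relu(-3.0), relu(2.5))
```

The smallest value in the output is 0, confirming that every negative input has been clamped, while positive inputs are returned unchanged.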