neural_network.activation_functions.rectified_linear_unit¶
This script demonstrates the implementation of the ReLU function.
ReLU is an activation function used in neural networks, defined as the positive part of its argument: f(x) = max(x, 0). Given a vector of real numbers, the function is applied element-wise, mapping each element x to max(x, 0), so every element of the output is either 0 or a positive real number.
Script inspired by the corresponding Wikipedia article: https://en.wikipedia.org/wiki/Rectifier_(neural_networks)
Functions¶
relu(vector) | Implements the ReLU function
Module Contents¶
- neural_network.activation_functions.rectified_linear_unit.relu(vector: list[float])¶
Implements the ReLU function
- Parameters:
vector (np.ndarray | list | tuple): A NumPy array of shape (1, n) containing real values, or a comparable list or tuple
- Returns:
relu_vec (np.ndarray): The input array with ReLU applied element-wise.
>>> vec = np.array([-1, 0, 5])
>>> relu(vec)
array([0, 0, 5])
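A minimal sketch of how the documented relu function could be implemented with NumPy is shown below; the signature and behaviour follow the documentation above, but the repository's actual implementation may differ.

import numpy as np

def relu(vector: list[float]) -> np.ndarray:
    """Apply the ReLU activation element-wise.

    >>> relu(np.array([-1, 0, 5]))
    array([0, 0, 5])
    """
    # np.maximum compares each element against 0 and keeps the larger
    # value, so negative entries become 0 and the rest pass through.
    return np.maximum(0, np.asarray(vector))

Using np.asarray means a list or tuple input is converted to an array first, matching the parameter types listed above.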