A Rectified Linear Unit (ReLU) is a common name for a neuron (the “unit”) with an activation function of f(x) = max(0, x).
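A minimal sketch of this activation in NumPy (the function name is illustrative, not from a particular library):

```python
import numpy as np

def relu(x):
    # Rectified Linear Unit: f(x) = max(0, x), applied element-wise.
    # Negative inputs are clipped to 0; positive inputs pass through unchanged.
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```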
Neural networks built with ReLU have the following advantages:
- Gradient computation is simpler because the activation function is computationally cheaper than comparable activation functions such as the sigmoid or hyperbolic tangent.
- Neural networks with ReLU are less susceptible to the vanishing gradient problem, but they may suffer from the dying ReLU problem; both effects are illustrated in the sketch after this list.
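The sketch below, assuming a NumPy implementation with illustrative function names, compares the derivative of ReLU with the derivative of the sigmoid: the ReLU gradient is exactly 1 for positive inputs (no shrinking, hence the reduced vanishing-gradient risk) and exactly 0 for negative inputs (the source of the dying ReLU problem), while the sigmoid gradient approaches 0 for inputs of large magnitude.

```python
import numpy as np

def relu_grad(x):
    # Derivative of max(0, x): 1 where x > 0, 0 elsewhere.
    return (x > 0).astype(float)

def sigmoid_grad(x):
    # Derivative of the sigmoid: sigma(x) * (1 - sigma(x)), never larger than 0.25.
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)

x = np.array([-10.0, -1.0, 0.5, 10.0])
print(relu_grad(x))     # [0. 0. 1. 1.]  -- gradient is 1 for positive inputs,
                        # but exactly 0 for negative ones (dying ReLU)
print(sigmoid_grad(x))  # [~4.5e-05, 0.197, 0.235, ~4.5e-05] -- near 0 in the tails
                        # (vanishing gradient)
```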