Dying ReLU

Dying ReLU refers to a problem that can arise when training neural networks with rectified linear units (ReLU), where \(\mathrm{ReLU}(x) = \max(0, x)\). A unit "dies" when it outputs 0 for every input it receives.
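As a minimal sketch of what a dead unit looks like, consider a single hidden unit whose weights and bias are hypothetical values chosen so that its pre-activation is negative for essentially every input, so its ReLU output is always 0:

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x), applied element-wise
    return np.maximum(0.0, x)

# Hypothetical unit: weights w and a very negative bias b (illustrative values only)
rng = np.random.default_rng(0)
inputs = rng.normal(size=(1000, 3))      # a batch of inputs
w = np.array([-0.5, -0.8, -0.3])
b = -10.0

pre_activation = inputs @ w + b          # z = w . x + b, negative for all samples here
activations = relu(pre_activation)

# The unit is "dead" on this data if it outputs 0 for every input
print("unit is dead:", np.all(activations == 0.0))
```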

Because the gradient of ReLU is 0 whenever its input is negative, a dead unit receives no gradient during training with stochastic gradient descent. Its incoming weights stop updating, so the unit is unlikely to return to life and is no longer useful during training.
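The following sketch, continuing the hypothetical unit above, shows why: backpropagating through ReLU multiplies the upstream gradient by \(\mathrm{ReLU}'(z)\), which is 0 when the pre-activation \(z\) is negative, so an SGD step leaves the weights unchanged.

```python
import numpy as np

def relu_grad(z):
    # Derivative of ReLU: 1 where z > 0, 0 where z < 0 (subgradient 0 taken at z = 0)
    return (z > 0).astype(float)

# Continuing the hypothetical dead unit: one training example
rng = np.random.default_rng(0)
x = rng.normal(size=3)
w = np.array([-0.5, -0.8, -0.3])
b = -10.0

z = x @ w + b                    # pre-activation is negative
upstream_grad = 1.0              # gradient arriving from the loss

# Backpropagation through ReLU: dL/dw = upstream * relu'(z) * x
grad_w = upstream_grad * relu_grad(z) * x
grad_b = upstream_grad * relu_grad(z)

# With z < 0 the local gradient is 0, so an SGD step changes nothing
lr = 0.1
w_new = w - lr * grad_w
print(grad_w, grad_b)            # all zeros
print(np.array_equal(w, w_new))  # True: the unit stays dead
```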

Leaky ReLU is a variant that mitigates the dying ReLU problem by returning a small negative slope, \(\alpha x\) for a small constant \(\alpha\) (commonly 0.01), when the input \(x\) is less than 0, so the gradient never becomes exactly zero.
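A minimal sketch of Leaky ReLU and its gradient, assuming the common choice \(\alpha = 0.01\): because the gradient is \(\alpha\) rather than 0 for negative inputs, the weights behind a "dead" unit still receive updates and the unit can recover.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # LeakyReLU(x) = x for x > 0, alpha * x otherwise
    return np.where(x > 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.01):
    # Gradient is 1 for x > 0 and alpha for x <= 0, so it is never exactly zero
    return np.where(x > 0, 1.0, alpha)

z = np.array([-3.0, -0.5, 0.2, 2.0])
print(leaky_relu(z))       # [-0.03  -0.005  0.2    2.   ]
print(leaky_relu_grad(z))  # [0.01   0.01   1.     1.   ]
```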