Leaky ReLU is a type of activation function that tries to solve the Dying ReLU problem.
A traditional rectified linear unit f(x) returns 0 when x ≤ 0. The Dying ReLU problem refers to a unit that gets stuck in this regime, returning 0 for every input; because the gradient is also 0 there, the unit's weights stop updating and it never recovers.
Leaky ReLU aims to fix this by returning a small, non-zero (negative) value for negative inputs instead of 0:
f(x) = x     if x > 0
f(x) = αx    if x ≤ 0

where α is typically a small value like α = 0.0001.
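The piecewise definition above can be sketched in a few lines of NumPy (the function name and the example inputs are illustrative, not from the original):

```python
import numpy as np

def leaky_relu(x, alpha=0.0001):
    # Positive inputs pass through unchanged; non-positive inputs
    # are scaled by alpha, giving a small non-zero output and a
    # non-zero gradient, so the unit cannot "die".
    return np.where(x > 0, x, alpha * x)

x = np.array([-3.0, -1.0, 0.0, 2.0])
print(leaky_relu(x))
```

With α = 0.0001, the negative inputs map to small negative values (e.g. -3.0 → -0.0003) rather than being clamped to 0 as a standard ReLU would do.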