ReLU (Rectified Linear Unit)

What is ReLU (Rectified Linear Unit)?

ReLU is an activation function defined as f(x) = max(0, x): it passes positive inputs through unchanged and maps negative inputs to zero. It is cheap to compute, and because its gradient is exactly 1 for positive inputs, it helps alleviate the vanishing gradient problem that affects saturating activations such as sigmoid and tanh.
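A minimal NumPy sketch of ReLU and its derivative (illustrative only; the convention of using 0 for the gradient at x = 0 is an assumption, though a common one):

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x), applied element-wise
    return np.maximum(0, x)

def relu_grad(x):
    # Derivative: 1 where x > 0, 0 elsewhere
    # (undefined at exactly 0; using 0 there is a common convention)
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # [0.  0.  0.  0.5 2. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

Because the gradient is 1 over the entire positive range, stacking many ReLU layers does not shrink gradients the way saturating activations do.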

Where did the term "ReLU (Rectified Linear Unit)" come from?

The term was popularized in the early 2010s, when rectified linear units (Nair and Hinton, 2010; Glorot et al., 2011) proved to be a key innovation for training deeper networks, most visibly in AlexNet (2012).

How is "ReLU (Rectified Linear Unit)" used today?

ReLU is the default activation function in most convolutional networks for computer vision and remains one of the most widely used activation functions across deep learning.
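As an illustrative sketch (assuming PyTorch; the framework and architecture are not specified by this entry), ReLU is typically applied after each convolutional layer in a vision model:

```python
import torch
import torch.nn as nn

# A small convolutional block of the kind common in vision models,
# with ReLU applied after each convolution.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, 10),
)

x = torch.randn(1, 3, 32, 32)   # one RGB image, 32x32
print(model(x).shape)           # torch.Size([1, 10])
```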

Related Terms