ReLU

Overview

Rectified Linear Unit (ReLU): an activation function defined as f(x) = max(0, x). It passes positive inputs through unchanged and outputs zero for any non-positive input.
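As a minimal sketch, ReLU can be implemented elementwise with NumPy (the function name `relu` here is illustrative, not a library API):

```python
import numpy as np

def relu(x):
    # ReLU: keep positive values, clamp everything else to zero
    return np.maximum(0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # negatives and zero map to 0.0; positives pass through
```

In practice, deep learning frameworks ship their own implementations (e.g. as a built-in activation), so hand-rolling it is mainly useful for illustration.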

Cross-References

Deep Learning
