Rectified Linear Units

Rectified Linear Units (ReLU) serve as the neuronal activation function in neural networks. A ReLU layer is composed of neurons that apply the function f(x) = max(0, x). These layers increase the non-linearity of the network while leaving the receptive fields of the convolutional layers unchanged. ReLU is generally preferred over alternatives such as the hyperbolic tangent or the sigmoid because it leads to a much faster training process without significantly affecting generalization accuracy.
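As a minimal sketch (not tied to any particular framework), the element-wise ReLU activation can be written in NumPy as follows; the array of sample inputs is chosen purely for illustration:

    import numpy as np

    def relu(x):
        # Element-wise ReLU: negative values are clamped to zero,
        # positive values pass through unchanged.
        return np.maximum(0, x)

    x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
    print(relu(x))  # [0.   0.   0.   1.5  3. ]

Because the function is a simple threshold at zero, both the forward pass and its gradient (0 for negative inputs, 1 for positive inputs) are cheap to compute, which is part of why training with ReLU is faster than with saturating activations such as tanh or the sigmoid.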
