Activation functions

The activation ops provide different types of nonlinearities for use in neural networks. These include smooth nonlinearities such as sigmoid, tanh, elu, softplus, and softsign, as well as continuous but not-everywhere-differentiable functions such as relu, relu6, crelu, and relu_x. All activation ops apply component-wise and produce a tensor with the same shape as the input tensor. Now let us see how to use a few commonly used activation functions in TensorFlow syntax.

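The snippet below is a minimal sketch of applying a few of these ops to a small tensor; it assumes the TensorFlow 1.x Session API (in TensorFlow 2.x the same tf.nn functions can be called eagerly, without a Session).

import tensorflow as tf

# A small batch of pre-activation values, including negatives,
# zero, and a value above 6, to show the effect of each op.
x = tf.constant([-3.0, -1.0, 0.0, 2.0, 7.0])

# Smooth nonlinearities
sigmoid_out = tf.nn.sigmoid(x)    # squashes each value into (0, 1)
tanh_out = tf.nn.tanh(x)          # squashes each value into (-1, 1)
softplus_out = tf.nn.softplus(x)  # smooth approximation of relu: log(1 + exp(x))

# Continuous but not everywhere differentiable
relu_out = tf.nn.relu(x)          # max(x, 0)
relu6_out = tf.nn.relu6(x)        # min(max(x, 0), 6)

with tf.Session() as sess:
    print(sess.run([sigmoid_out, tanh_out, softplus_out, relu_out, relu6_out]))

Because each op returns a tensor of the same shape as its input, any of them can be applied directly to the output of a convolutional or fully connected layer.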