There's more...

In this section, we compared the ReLU and sigmoid activation functions for neural networks. Many other activation functions are commonly used for neural networks, but most fall into one of two categories: the first category contains functions shaped like the sigmoid function, such as arctan, hyperbolic tangent (tanh), and the Heaviside step; the second category contains functions shaped like the ReLU function, such as softplus and leaky ReLU. Most of what we discussed in this section about comparing the two functions holds true for activations in either category. However, it is important to note that the choice of activation function has a big impact on how gradients flow backward through the network during training, and therefore on how quickly and reliably the network learns. A quick way to build intuition for the two families is to plot a few members of each side by side, as sketched below.
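The following is a minimal sketch of such a comparison plot. It assumes TensorFlow 2.x with eager execution and matplotlib, neither of which this recipe sets up; the Heaviside step is implemented directly with a cast, since TensorFlow does not ship it as a built-in activation:

```python
import tensorflow as tf
import matplotlib.pyplot as plt

# Evaluate each activation on a common range of inputs
x = tf.linspace(-5.0, 5.0, 200)

# Sigmoid-shaped family: bounded outputs that saturate at both extremes
sigmoid_family = {
    'sigmoid': tf.nn.sigmoid(x),
    'tanh': tf.nn.tanh(x),
    'arctan': tf.math.atan(x),
    'Heaviside step': tf.cast(x > 0.0, tf.float32),
}

# ReLU-shaped family: unbounded above, non-saturating for positive inputs
relu_family = {
    'ReLU': tf.nn.relu(x),
    'softplus': tf.nn.softplus(x),
    'leaky ReLU': tf.nn.leaky_relu(x, alpha=0.2),
}

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
for name, y in sigmoid_family.items():
    ax1.plot(x.numpy(), y.numpy(), label=name)
ax1.set_title('Sigmoid-shaped activations')
ax1.legend()
for name, y in relu_family.items():
    ax2.plot(x.numpy(), y.numpy(), label=name)
ax2.set_title('ReLU-shaped activations')
ax2.legend()
plt.show()
```

In the resulting plots, notice that the sigmoid-shaped curves flatten out at both ends (where gradients vanish), while the ReLU-shaped curves keep growing for positive inputs, which is the key difference discussed in this section.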
