In this section, we will compare and contrast two different activation functions: the sigmoid and the rectified linear unit (ReLU). Recall that the two functions are given by the following equations:

sigmoid: f(x) = 1 / (1 + e^(-x))

ReLU: f(x) = max(0, x)
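To see the difference numerically, here is a minimal sketch that evaluates both activations on a few sample inputs, assuming TensorFlow 2.x with eager execution; the inputs are illustrative and not taken from the book's recipes:

import tensorflow as tf

# Sample inputs spanning negative and positive values (illustrative).
x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])

# Sigmoid squashes every input into the open interval (0, 1).
print(tf.nn.sigmoid(x).numpy())  # approx. [0.119 0.269 0.5 0.731 0.881]

# ReLU passes positive inputs through unchanged and zeroes out negatives.
print(tf.nn.relu(x).numpy())     # [0. 0. 0. 1. 2.]

The outputs illustrate the key contrast: the sigmoid saturates toward 0 and 1 at the extremes, while ReLU is zero for negative inputs and unbounded above.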