In this section, we will compare and contrast two different activation functions, the sigmoid and the rectified linear unit (ReLU). Recall that the two functions are given by the following equations:

sigmoid: f(x) = 1 / (1 + e^(-x))
ReLU:    f(x) = max(0, x)
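
To make the comparison concrete, here is a minimal NumPy sketch of both activations (TensorFlow provides the same functions as `tf.nn.sigmoid` and `tf.nn.relu`; NumPy is used here only to keep the example self-contained):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Zero for negative inputs, identity for non-negative inputs
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # values in (0, 1); sigmoid(0) = 0.5
print(relu(x))     # negative inputs clipped to 0
```

Note how the sigmoid saturates toward 0 and 1 for large-magnitude inputs, while ReLU is unbounded above and exactly zero for all negative inputs.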