In this section, we will compare and contrast two different activation functions: the sigmoid and the rectified linear unit (ReLU). Recall that the two functions are given by the following equations:

sigmoid: f(x) = 1 / (1 + e^-x)

ReLU: f(x) = max(0, x)

From "Working with Gates and Activation Functions" in TensorFlow Machine Learning Cookbook by Nick McClure (Packt Publishing, February 2017).
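The book's recipes work in TensorFlow (where these activations are available as built-in ops such as `tf.nn.sigmoid` and `tf.nn.relu`); the plain-Python sketch below simply evaluates the two formulas side by side to illustrate their contrasting behavior. The sample inputs are arbitrary, chosen only for illustration.

```python
import math

def sigmoid(x):
    """Sigmoid: squashes any real input into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    """ReLU: passes positive inputs through unchanged, zeroes out negatives."""
    return max(0.0, x)

# Compare the two activations on a few sample inputs.
for x in [-2.0, 0.0, 2.0]:
    print(f"x = {x:+.1f}  sigmoid = {sigmoid(x):.4f}  relu = {relu(x):.1f}")
```

Note the contrast the section goes on to draw: sigmoid saturates toward 0 and 1 for large-magnitude inputs, while ReLU is unbounded above and exactly zero for all negative inputs.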