Activation functions

Picking a good activation function makes training much easier. The choice also carries at least two practical consequences: how you should transform your input data, and how binary variables (if there are any) should be encoded. There is no shortage of activation functions to choose from; in principle, almost any continuous function can serve, so you can even make one of your own.

The only requirement for a function to be eligible as an activation function is that it be differentiable, or at least that you can assume a reasonable proxy for the derivative at the points where it is not differentiable.
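For instance, here is a minimal sketch in R of a hand-rolled activation function. Softplus is a standard smooth choice that is differentiable everywhere, and its exact derivative is the logistic sigmoid; the function names below are illustrative, not taken from any package.

    # Softplus: a smooth activation, differentiable everywhere
    softplus <- function(x) log(1 + exp(x))

    # Its exact derivative is the logistic sigmoid
    softplus_grad <- function(x) 1 / (1 + exp(-x))

    x <- c(-2, 0, 2)
    softplus(x)       # 0.1269280 0.6931472 2.1269280
    softplus_grad(x)  # 0.1192029 0.5000000 0.8807971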

Here is a list of popular activation functions:

  • Rectified Linear Unit (ReLU): f(x) = max(0, x), which passes positive inputs through unchanged and maps everything else to zero; see the sketch below.
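ReLU is a good example of the derivative-proxy point above: it is not differentiable at exactly zero, so implementations conventionally report 0 (or sometimes 1) as the gradient there. A minimal base-R sketch:

    # ReLU: f(x) = max(0, x), applied element-wise
    relu <- function(x) pmax(0, x)

    # The derivative is 0 for x < 0 and 1 for x > 0; at x = 0 it is
    # undefined, so 0 is used here as a conventional proxy
    relu_grad <- function(x) as.numeric(x > 0)

    x <- c(-2, -0.5, 0, 0.5, 2)
    relu(x)       # 0.0 0.0 0.0 0.5 2.0
    relu_grad(x)  # 0 0 0 1 1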
