Using ReLU

In TensorFlow, the function tf.nn.relu(features, name=None) computes the rectified linear activation, max(features, 0), and returns a tensor of the same type as features. Its parameters are:

  • features: A tensor. It must be one of the following types: float32, float64, int32, int64, uint8, int16, int8, uint16, or half.
  • name: A name for the operation (optional).
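The following minimal sketch shows tf.nn.relu in action; it assumes TensorFlow with eager execution enabled (TensorFlow 2.x), so the result can be printed directly rather than fetched through a Session:

    import tensorflow as tf

    # A small tensor mixing negative, zero, and positive values.
    features = tf.constant([-2.0, -0.5, 0.0, 1.5, 3.0], dtype=tf.float32)

    # tf.nn.relu replaces every negative entry with 0 and leaves the rest unchanged.
    activated = tf.nn.relu(features, name="relu_example")

    print(activated)  # tf.Tensor([0.  0.  0.  1.5 3. ], shape=(5,), dtype=float32)

The name argument only labels the operation in the computation graph; omitting it does not change the result.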

For more on how to use other activation functions, please refer to the TensorFlow website. At this point, we have enough theoretical background to build our first CNN and use it to make predictions.
