Backpropagation through time (BPTT)

The simplest way to train an RNN relies on a representational trick. Since the input sequences are limited and their length can be fixed, the single neuron with a feedback connection can be restructured as an unrolled feed-forward network: one copy of the neuron per timestep, chained together. In the following diagram, there's an example with k timesteps:

Example of unrolled recurrent network

This network (which can be easily extended to more complex architectures with several layers) is exactly like an MLP, but in this case, the weights of each clone are shared. The algorithm called BPTT is the natural extension of standard backpropagation to this unrolled structure: the error is backpropagated through all k timesteps, and the gradients computed for each clone are summed before updating the single set of shared weights, so all copies remain identical.
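As a minimal sketch of this idea, the snippet below unrolls a toy single-layer tanh RNN for k timesteps, runs it forward, and then backpropagates through the unrolled chain, accumulating each clone's gradient into the one shared pair of weight matrices. All dimensions, names (`W_x`, `W_h`), and the squared-error loss on the last hidden state are illustrative assumptions, not taken from the book:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical): k timesteps, input size, hidden size
k, n_in, n_hid = 4, 3, 5

# Shared weights: every unrolled "clone" uses these same matrices
W_x = rng.normal(scale=0.1, size=(n_hid, n_in))   # input -> hidden
W_h = rng.normal(scale=0.1, size=(n_hid, n_hid))  # hidden -> hidden (feedback)

xs = rng.normal(size=(k, n_in))   # an input sequence of k timesteps
target = rng.normal(size=n_hid)   # toy target for the final hidden state

# Forward pass: unroll the recurrence h_t = tanh(W_x x_t + W_h h_{t-1})
hs = [np.zeros(n_hid)]            # h_0 = 0
for t in range(k):
    hs.append(np.tanh(W_x @ xs[t] + W_h @ hs[-1]))

loss = 0.5 * np.sum((hs[-1] - target) ** 2)

# Backward pass (BPTT): walk the unrolled chain from the last clone back
# to the first, summing each clone's gradient into the shared matrices
dW_x = np.zeros_like(W_x)
dW_h = np.zeros_like(W_h)
dh = hs[-1] - target              # dL/dh at the final timestep
for t in reversed(range(k)):
    dz = dh * (1.0 - hs[t + 1] ** 2)  # backprop through tanh
    dW_x += np.outer(dz, xs[t])       # clone gradients accumulate
    dW_h += np.outer(dz, hs[t])
    dh = W_h.T @ dz                   # propagate error to timestep t-1
```

Because the clones share parameters, a single gradient step with `dW_x` and `dW_h` keeps every copy identical, which is exactly what distinguishes BPTT from plain backpropagation on an ordinary MLP of depth k.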
