Long short-term memory networks

LSTM is a particular RNN architecture, originally proposed by Hochreiter and Schmidhuber in 1997. This type of neural network has recently been rediscovered in the context of deep learning because it is free from the vanishing gradient problem, and in practice it offers excellent results and performance.

The vanishing gradient problem affects the training of ANNs with gradient-based learning methods. In gradient-based methods such as backpropagation, weights are adjusted in proportion to the gradient of the error. Because of the way these gradients are calculated, their magnitude decreases exponentially as backpropagation proceeds toward the deeper layers. The problem ...
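The exponential shrinkage can be seen with a small numeric sketch (not taken from the book): backpropagating through a vanilla RNN multiplies the gradient by the recurrent Jacobian at every time step, so if the recurrent weight matrix has spectral norm below 1, the gradient norm decays geometrically with the number of steps.

```python
import numpy as np

np.random.seed(0)

hidden = 16
W = np.random.randn(hidden, hidden)
W *= 0.9 / np.linalg.norm(W, ord=2)  # rescale so the largest singular value is 0.9

grad = np.ones(hidden)   # gradient arriving at the last time step
norms = []
for t in range(50):      # walk the chain rule back through 50 time steps
    grad = W.T @ grad    # each step multiplies by the recurrent Jacobian
    norms.append(np.linalg.norm(grad))

print(f"gradient norm after 10 steps: {norms[9]:.3e}")
print(f"gradient norm after 50 steps: {norms[49]:.3e}")  # far smaller
```

With a spectral norm of 0.9, the norm after 50 steps is bounded by 0.9^50 of the initial norm, which is why gradients from distant time steps contribute almost nothing to the weight update. LSTM's gated cell state is designed precisely to keep this backward signal from decaying.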
