Summary

LSTM networks are equipped with special hidden units, called memory cells, designed to retain information over long periods of time. At each time step, a memory cell takes the network's previous state and the current input, combines them with the cell's current contents, and uses gating units to decide what to keep and what to erase from memory. This gating mechanism has made LSTM a very effective way of learning long-term dependencies.
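To make the gating mechanism concrete, here is a minimal NumPy sketch of a single LSTM step. The weight names, dimensions, and initialization are illustrative assumptions, not taken from the chapter; a real model would use a library layer such as TensorFlow's LSTM.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step: gates decide what to keep, forget, and emit."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b   # stacked pre-activations for all gates
    f = sigmoid(z[:n])           # forget gate: what to erase from memory
    i = sigmoid(z[n:2 * n])      # input gate: what to write to memory
    o = sigmoid(z[2 * n:3 * n])  # output gate: what to expose as state
    g = np.tanh(z[3 * n:])       # candidate memory contents
    c = f * c_prev + i * g       # combine old memory with new input
    h = o * np.tanh(c)           # new hidden state
    return h, c

# Toy dimensions and random weights (hypothetical, for illustration only)
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.standard_normal((4 * n_hid, n_in)) * 0.1
U = rng.standard_normal((4 * n_hid, n_hid)) * 0.1
b = np.zeros(4 * n_hid)

h = np.zeros(n_hid)
c = np.zeros(n_hid)
for t in range(5):
    x = rng.standard_normal(n_in)
    h, c = lstm_step(x, h, c, W, U, b)
```

Because the cell state `c` is updated additively (`f * c_prev + i * g`) rather than being overwritten, gradients can flow through many time steps, which is what lets the network remember inputs from far in the past.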

In this chapter, we discussed RNNs. We saw how to make predictions with data that has a high temporal dependency, and how to develop several real-life predictive models that make predictive analytics easier using RNNs and the different architectural ...
