Chapter 5. Generating Text in the Style of an Example Text
In this chapter we'll look at how we can use recurrent neural networks (RNNs) to generate text in the style of a body of text. This makes for fun demos. People have used this type of network to generate anything from names of babies to descriptions of colors. These demos are a good way to get comfortable with recurrent networks. RNNs have their practical uses too—later in the book we'll use them to train a chatbot and build a recommender system for music based on harvested playlists, and RNNs have been used in production to track objects in video.
The recurrent neural network is a type of neural network that is helpful when working with time series or other sequential data. We'll first look at Project Gutenberg as a source of free books and download the collected works of William Shakespeare using some simple code. Next, we'll use an RNN to produce texts that seem Shakespearean (if you don't pay too much attention) by training the network on the downloaded text. We'll then repeat the trick on Python code, and see how to vary the output. Finally, since Python code has a predictable structure, we can look at which neurons fire on which bits of code and visualize the workings of our RNN.
The code for this chapter can be found in the following Python notebook:
05.1 Generating Text in the Style of an Example Text
5.1 Acquiring the Text of Public Domain Books
Problem
You want to download the full text of some public domain books to ...
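As a starting point, here is a minimal sketch of fetching a plain-text book from Project Gutenberg with only the standard library. The helper names (`strip_gutenberg_boilerplate`, `fetch_book`) are hypothetical and this is not necessarily the book's own approach; the URL pattern assumes Gutenberg's usual `cache/epub/<id>/pg<id>.txt` layout.

```python
from urllib.request import urlopen

# Assumed URL pattern for Project Gutenberg plain-text files (by book ID).
GUTENBERG_URL = "https://www.gutenberg.org/cache/epub/{0}/pg{0}.txt"


def strip_gutenberg_boilerplate(raw):
    """Keep only the text between the '*** START ...' and '*** END ...' markers
    that Project Gutenberg wraps around every ebook."""
    start = raw.find("*** START")
    if start != -1:
        # Skip past the rest of the marker line.
        start = raw.find("\n", start) + 1
    else:
        start = 0
    end = raw.find("*** END")
    if end == -1:
        end = len(raw)
    return raw[start:end].strip()


def fetch_book(book_id):
    """Download one book by its Gutenberg ID and strip the license boilerplate.
    (Requires network access.)"""
    with urlopen(GUTENBERG_URL.format(book_id)) as resp:
        return strip_gutenberg_boilerplate(resp.read().decode("utf-8"))
```

Stripping the header and footer matters for our purposes: otherwise the network would also learn to generate Project Gutenberg license text along with the Shakespeare.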