Hands-On Natural Language Processing with Python

Book description

Enhance your NLP applications with the help of deep learning, NLTK, and TensorFlow

Key Features

  • Weave neural networks into linguistic applications across various platforms
  • Perform NLP tasks and train NLP models using NLTK and TensorFlow
  • Boost your NLP models with powerful deep learning architectures such as CNNs and RNNs

Natural language processing (NLP) has found applications in various domains, such as web search, advertising, and customer service, and with the help of deep learning, we can enhance its performance in these areas. Hands-On Natural Language Processing with Python teaches you how to leverage deep learning models to perform various NLP tasks, along with best practices for dealing with today's NLP challenges.

To begin with, you will understand the core concepts of NLP and deep learning, such as convolutional neural networks (CNNs), recurrent neural networks (RNNs), semantic embedding, Word2vec, and more. You will learn how to perform a wide range of NLP tasks with neural networks, and how to train and deploy those networks in your NLP applications. You will become accustomed to using RNNs and CNNs in application areas such as text classification and sequence labeling, which are essential for sentiment analysis, customer service chatbots, and anomaly detection. Finally, you will gain the practical knowledge needed to implement deep learning in your linguistic applications using Python's popular deep learning library, TensorFlow.

By the end of this book, you will be well versed in building deep learning-backed NLP applications, and in overcoming NLP challenges with best practices developed by domain experts.

What you will learn

  • Implement semantic embedding of words to classify and find entities
  • Train Word2vec models to convert words to vectors and perform arithmetic operations on them (see the sketch after this list)
  • Train deep learning models to classify tweets and news articles
  • Implement a question-answering model with search and RNN models
  • Train models on various text classification datasets using CNNs
  • Implement WaveNet, a deep generative model, to produce a natural-sounding voice
  • Convert voice to text and text to voice
  • Train a speech-to-text model using DeepSpeech
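
For a taste of the word-vector arithmetic mentioned above, here is a minimal, hypothetical sketch using the gensim library. The book builds its own skip-gram model rather than using gensim, so the library choice, the toy corpus, and the parameter values below are illustrative assumptions, not the book's approach:

    from gensim.models import Word2Vec

    # A tiny toy corpus (illustrative only); meaningful vector
    # arithmetic requires training on a much larger corpus.
    sentences = [
        ["king", "rules", "the", "kingdom"],
        ["queen", "rules", "the", "kingdom"],
        ["man", "walks", "in", "the", "city"],
        ["woman", "walks", "in", "the", "city"],
    ]

    # Train a small skip-gram model (sg=1); vector_size and window
    # are assumed parameter choices for this sketch.
    model = Word2Vec(sentences, vector_size=32, window=2, min_count=1, sg=1)

    # The classic analogy: king - man + woman should land near
    # "queen" once the model is trained on enough text.
    print(model.wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))

On a realistically sized corpus, the printed nearest neighbor for king - man + woman is typically queen; on this toy corpus the result is not meaningful and serves only to show the API shape.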

Who this book is for

Hands-On Natural Language Processing with Python is for you if you are a developer, a machine learning engineer, or an NLP engineer who wants to build deep learning applications that leverage NLP techniques. This comprehensive guide is also useful for deep learning practitioners who want to extend their skills to building NLP applications. All you need to enjoy the book is a basic knowledge of machine learning and Python.

Table of contents

  1. Title Page
  2. Copyright and Credits
    1. Hands-On Natural Language Processing with Python
  3. Packt Upsell
    1. Why subscribe?
    2. PacktPub.com
  4. Foreword
  5. Contributors
    1. About the authors
    2. About the reviewer
    3. Packt is searching for authors like you
  6. Preface
    1. Who this book is for
    2. What this book covers
    3. To get the most out of this book
      1. Download the example code files
      2. Download the color images
      3. Conventions used
    4. Get in touch
      1. Reviews
  7. Getting Started
    1. Basic concepts and terminology in NLP
      1. Text corpus or corpora
      2. Paragraph
      3. Sentences
      4. Phrases and words
      5. N-grams
      6. Bag-of-words
    2. Applications of NLP
      1. Analyzing sentiment
      2. Recognizing named entities
      3. Linking entities
      4. Translating text
      5. Natural language inference
      6. Semantic role labeling
      7. Relation extraction
      8. SQL query generation, or semantic parsing
      9. Machine comprehension
      10. Textual entailment
      11. Coreference resolution
      12. Searching
      13. Question answering and chatbots
      14. Converting text to voice
      15. Converting voice to text
      16. Speaker identification
      17. Spoken dialog systems
      18. Other applications
    3. Summary
  8. Text Classification and POS Tagging Using NLTK
    1. Installing NLTK and its modules
    2. Text preprocessing and exploratory analysis
      1. Tokenization
      2. Stemming
      3. Removing stop words
    3. Exploratory analysis of text
    4. POS tagging
      1. What is POS tagging?
      2. Applications of POS tagging
      3. Training a POS tagger
    5. Training a sentiment classifier for movie reviews
    6. Training a bag-of-words classifier
    7. Summary
  9. Deep Learning and TensorFlow
    1. Deep learning
      1. Perceptron
      2. Activation functions
        1. Sigmoid
        2. Hyperbolic tangent
        3. Rectified linear unit
      3. Neural network
        1. One-hot encoding
        2. Softmax
        3. Cross-entropy
      4. Training neural networks
        1. Backpropagation
        2. Gradient descent
        3. Stochastic gradient descent
        4. Regularization techniques
          1. Dropout
          2. Batch normalization
          3. L1 and L2 regularization
      5. Convolutional neural network
        1. Kernel
        2. Max pooling
      6. Recurrent neural network
        1. Long short-term memory
    2. TensorFlow
      1. General-purpose graphics processing unit (GPGPU)
        1. CUDA
        2. cuDNN
      2. Installation
      3. Hello world!
      4. Adding two numbers
      5. TensorBoard
      6. The Keras library
    3. Summary
  10. Semantic Embedding Using Shallow Models
    1. Word vectors
      1. The classical approach
      2. Word2vec
      3. The CBOW model
      4. The skip-gram model
        1. A comparison of skip-gram and CBOW model architectures
        2. Building a skip-gram model
        3. Visualization of word embeddings
    2. From word to document embeddings
    3. Sentence2vec
    4. Doc2vec
      1. Visualization of document embeddings
    5. Summary
  11. Text Classification Using LSTM
    1. Data for text classification
    2. Topic modeling
      1. Topic modeling versus text classification
    3. Deep learning meta architecture for text classification
      1. Embedding layer
      2. Deep representation
      3. Fully connected part
    4. Identifying spam in YouTube video comments using RNNs
    5. Classifying news articles by topic using a CNN
    6. Transfer learning using GloVe embeddings
    7. Multi-label classification
      1. Binary relevance
      2. Deep learning for multi-label classification
      3. Attention networks for document classification
    8. Summary
  12. Searching and Deduplicating Using CNNs
    1. Data
      1. Data description
    2. Training the model
      1. Encoding the text
      2. Modeling with CNN
      3. Training
      4. Inference
    3. Summary
  13. Named Entity Recognition Using Character LSTM
    1. NER with deep learning
      1. Data
      2. Model
        1. Word embeddings
      3. Walking through the code
        1. Input
        2. Word embedding
      4. The effects of different pretrained word embeddings
        1. Neural network architecture
        2. Decoding predictions
        3. The training step
      5. Scope for improvement
    2. Summary
  14. Text Generation and Summarization Using GRUs
    1. Generating text using RNNs
      1. Generating Linux kernel code with a GRU
    2. Text summarization
      1. Extractive summarization
        1. Summarization using gensim
      2. Abstractive summarization
        1. Encoder-decoder architecture
        2. Encoder
        3. Decoder
        4. News summarization using GRU
        5. Data preparation
        6. Encoder network
        7. Decoder network
        8. Sequence-to-sequence
        9. Building the graph
        10. Training
        11. Inference
        12. TensorBoard visualization
      3. State-of-the-art abstractive text summarization
    3. Summary
  15. Question-Answering and Chatbots Using Memory Networks
    1. The question-answering task
      1. Question-answering datasets
    2. Memory networks for question-answering
      1. Memory network pipeline overview
      2. Writing a memory network in TensorFlow
        1. Class constructor
        2. Input module
        3. Question module
        4. Memory module
        5. Output module
        6. Putting it together
    3. Extending memory networks for dialog modeling
      1. Dialog datasets
        1. The bAbI dialog dataset
          1. Raw data format
      2. Writing a chatbot in TensorFlow
        1. Loading dialog datasets in the QA format
        2. Vectorizing the data
        3. Wrapping the memory network model in a chatbot class
          1. Class constructor
          2. Building a vocabulary for word embedding lookup
          3. Training the chatbot model
          4. Evaluating the chatbot on the testing set
          5. Interacting with the chatbot
          6. Putting it all together
          7. Example of an interactive conversation
      3. Literature on and related to memory networks
    4. Summary
  16. Machine Translation Using the Attention-Based Model
    1. Overview of machine translation
      1. Statistical machine translation
        1. English to French using NLTK SMT models
      2. Neural machine translation
        1. Encoder-decoder network
        2. Encoder-decoder with attention
        3. NMT for French to English using attention
          1. Data preparation
          2. Encoder network
          3. Decoder network
          4. Sequence-to-sequence model
          5. Building the graph
          6. Training
          7. Inference
          8. TensorBoard visualization
    2. Summary
  17. Speech Recognition Using DeepSpeech
    1. Overview of speech recognition
    2. Building an RNN model for speech recognition
      1. Audio signal representation
      2. LSTM model for spoken digit recognition
      3. TensorBoard visualization
      4. Speech to text using the DeepSpeech architecture
        1. Overview of the DeepSpeech model
        2. Speech recordings dataset
        3. Preprocessing the audio data
        4. Creating the model
        5. TensorBoard visualization
      5. State-of-the-art in speech recognition
    3. Summary
  18. Text-to-Speech Using Tacotron
    1. Overview of text to speech
      1. Naturalness versus intelligibility
      2. How is the performance of a TTS system evaluated?
      3. Traditional techniques – concatenative and parametric models
      4. A few reminders on spectrograms and the mel scale
    2. TTS in deep learning
      1. WaveNet, in brief
      2. Tacotron
        1. The encoder
        2. The attention-based decoder
        3. The Griffin-Lim-based postprocessing module
        4. Details of the architecture
        5. Limitations
    3. Implementation of Tacotron with Keras
      1. The dataset
      2. Data preparation
        1. Preparation of text data
        2. Preparation of audio data
      3. Implementation of the architecture
        1. Pre-net
        2. Encoder and postprocessing CBHG
        3. Attention RNN
        4. Decoder RNN
        5. The attention mechanism
        6. Full architecture, with attention
      4. Training and testing
    4. Summary
  19. Deploying Trained Models
    1. Increasing performance
      1. Quantizing the weights
      2. MobileNets
    2. TensorFlow Serving
      1. Exporting the trained model
      2. Serving the exported model
    3. Deploying in the cloud
      1. Amazon Web Services
      2. Google Cloud Platform
    4. Deploying on mobile devices
      1. iPhone
      2. Android
    5. Summary
  20. Other Books You May Enjoy
    1. Leave a review - let other readers know what you think

Product information

  • Title: Hands-On Natural Language Processing with Python
  • Author(s): Rajesh Arumugam, Rajalingappaa Shanmugamani
  • Release date: July 2018
  • Publisher(s): Packt Publishing
  • ISBN: 9781789139495