Machine Learning

Book description

Machine learning, one of the top emerging sciences, has an extremely broad range of applications. However, many books on the subject provide only a theoretical approach, making it difficult for a newcomer to grasp the material. This book takes a more practical approach: it explains the concepts behind each machine learning algorithm, describes the areas in which the algorithm is applied, and uses simple, practical examples to demonstrate the algorithm and to show how the issues that arise when applying it are addressed.

Table of contents

  1. Cover
  2. Half Title
  3. Title Page
  4. Copyright Page
  5. Table of Contents
  6. Dedication
  7. Preface
  8. Acknowledgments
  9. Authors
  10. Introduction
  11. 1 Introduction to Machine Learning
    1. 1.1 Introduction
    2. 1.2 Preliminaries
      1. 1.2.1 Machine Learning: Where Several Disciplines Meet
      2. 1.2.2 Supervised Learning
      3. 1.2.3 Unsupervised Learning
      4. 1.2.4 Semi-Supervised Learning
      5. 1.2.5 Reinforcement Learning
      6. 1.2.6 Validation and Evaluation
    3. 1.3 Applications of Machine Learning Algorithms
      1. 1.3.1 Automatic Recognition of Handwritten Postal Codes
      2. 1.3.2 Computer-Aided Diagnosis
      3. 1.3.3 Computer Vision
        1. 1.3.3.1 Driverless Cars
        2. 1.3.3.2 Face Recognition and Security
      4. 1.3.4 Speech Recognition
      5. 1.3.5 Text Mining
        1. 1.3.5.1 Where Text and Image Data Can Be Used Together
    4. 1.4 The Present and the Future
      1. 1.4.1 Thinking Machines
      2. 1.4.2 Smart Machines
      3. 1.4.3 Deep Blue
      4. 1.4.4 IBM’s Watson
      5. 1.4.5 Google Now
      6. 1.4.6 Apple’s Siri
      7. 1.4.7 Microsoft’s Cortana
    5. 1.5 Objective of This Book
    6. References
  12. SECTION I SUPERVISED LEARNING ALGORITHMS
    1. 2 Decision Trees
      1. 2.1 Introduction
      2. 2.2 Entropy
        1. 2.2.1 Example
        2. 2.2.2 Understanding the Concept of Number of Bits
      3. 2.3 Attribute Selection Measure
        1. 2.3.1 Information Gain of ID3
        2. 2.3.2 The Problem with Information Gain
      4. 2.4 Implementation in MATLAB®
        1. 2.4.1 Gain Ratio of C4.5
        2. 2.4.2 Implementation in MATLAB
      5. References
    2. 3 Rule-Based Classifiers
      1. 3.1 Introduction to Rule-Based Classifiers
      2. 3.2 Sequential Covering Algorithm
      3. 3.3 Algorithm
      4. 3.4 Visualization
      5. 3.5 Ripper
        1. 3.5.1 Algorithm
        2. 3.5.2 Understanding Rule Growing Process
        3. 3.5.3 Information Gain
        4. 3.5.4 Pruning
        5. 3.5.5 Optimization
      6. References
    3. 4 Naïve Bayesian Classification
      1. 4.1 Introduction
      2. 4.2 Example
      3. 4.3 Prior Probability
      4. 4.4 Likelihood
      5. 4.5 Laplace Estimator
      6. 4.6 Posterior Probability
      7. 4.7 MATLAB Implementation
      8. References
    4. 5 The k-Nearest Neighbors Classifiers
      1. 5.1 Introduction
      2. 5.2 Example
      3. 5.3 k-Nearest Neighbors in MATLAB®
      4. References
    5. 6 Neural Networks
      1. 6.1 Perceptron Neural Network
        1. 6.1.1 Perceptrons
      2. 6.2 MATLAB Implementation of the Perceptron Training and Testing Algorithms
      3. 6.3 Multilayer Perceptron Networks
      4. 6.4 The Backpropagation Algorithm
        1. 6.4.1 Weights Updates in Neural Networks
      5. 6.5 Neural Networks in MATLAB
      6. References
    6. 7 Linear Discriminant Analysis
      1. 7.1 Introduction
      2. 7.2 Example
      3. References
    7. 8 Support Vector Machine
      1. 8.1 Introduction
      2. 8.2 Definition of the Problem
        1. 8.2.1 Design of the SVM
        2. 8.2.2 The Case of Nonlinear Kernel
      3. 8.3 The SVM in MATLAB®
      4. References
  13. SECTION II UNSUPERVISED LEARNING ALGORITHMS
    1. 9 k-Means Clustering
      1. 9.1 Introduction
      2. 9.2 Description of the Method
      3. 9.3 The k-Means Clustering Algorithm
      4. 9.4 The k-Means Clustering in MATLAB®
    2. 10 Gaussian Mixture Model
      1. 10.1 Introduction
      2. 10.2 Learning the Concept by Example
      3. References
    3. 11 Hidden Markov Model
      1. 11.1 Introduction
      2. 11.2 Example
      3. 11.3 MATLAB Code
      4. References
    4. 12 Principal Component Analysis
      1. 12.1 Introduction
      2. 12.2 Description of the Problem
      3. 12.3 The Idea behind the PCA
        1. 12.3.1 The SVD and Dimensionality Reduction
      4. 12.4 PCA Implementation
        1. 12.4.1 Number of Principal Components to Choose
        2. 12.4.2 Data Reconstruction Error
      5. 12.5 The Following MATLAB® Code Applies the PCA
      6. 12.6 Principal Component Methods in Weka
      7. 12.7 Example: Polymorphic Worms Detection Using PCA
        1. 12.7.1 Introduction
        2. 12.7.2 SEA, MKMP, and PCA
        3. 12.7.3 Overview and Motivation for Using String Matching
        4. 12.7.4 The KMP Algorithm
        5. 12.7.5 Proposed SEA
        6. 12.7.6 An MKMP Algorithm
          1. 12.7.6.1 Testing the Quality of the Generated Signature for Polymorphic Worm A
        7. 12.7.7 A Modified Principal Component Analysis
          1. 12.7.7.1 Our Contributions in the PCA
          2. 12.7.7.2 Testing the Quality of Generated Signature for Polymorphic Worm A
          3. 12.7.7.3 Clustering Method for Different Types of Polymorphic Worms
        8. 12.7.8 Signature Generation Algorithms Pseudo-Codes
          1. 12.7.8.1 Signature Generation Process
      8. References
  14. Appendix I: Transcript of Conversations with Chatbot
  15. Appendix II: Creative Chatbot
  16. Index

Product information

  • Title: Machine Learning
  • Author(s): Mohssen Mohammed, Muhammad Badruddin Khan, Eihab Mohammed Bashier
  • Release date: August 2016
  • Publisher(s): CRC Press
  • ISBN: 9781315354415