Deep Learning for Numerical Applications with SAS

Book description

Foreword by Oliver Schabenberger, PhD
Executive Vice President, Chief Operating Officer, and Chief Technology Officer, SAS

Dive into deep learning! Machine learning and deep learning are ubiquitous in our homes and workplaces, from machine translation and image recognition to predictive analytics and autonomous driving. Deep learning holds the promise of improving many everyday tasks in a variety of disciplines. Much of the deep learning literature explains the mechanics of deep learning with the goal of implementing cognitive applications fueled by Big Data. This book is different. Written by an expert in high-performance analytics, Deep Learning for Numerical Applications with SAS introduces a new field: Deep Learning for Numerical Applications (DL4NA). Unlike conventional deep learning, the primary goal of DL4NA is not to learn from data but to dramatically improve the performance of numerical applications by training deep neural networks.

Deep Learning for Numerical Applications with SAS presents deep learning concepts in SAS along with step-by-step techniques that allow you to easily reproduce the examples on your high-performance analytics systems. It also discusses the latest hardware innovations that can power your SAS programs: from many-core CPUs to GPUs to FPGAs to ASICs.

This book assumes the reader has no prior knowledge of high-performance computing, machine learning, or deep learning. It is intended for SAS developers who want to build and run the fastest analytics. In addition to discovering the latest trends in hybrid architectures with GPUs and FPGAs, readers will learn how to

  • Use deep learning in SAS
  • Speed up their analytics using deep learning
  • Easily write highly parallel programs using the many-task computing paradigm

This book is part of the SAS Press program.

Table of contents

  1. Preface
  2. About This Book
  3. About The Author
  4. Acknowledgments
  5. Chapter 1: Introduction
    1. Deep Learning
    2. Is Deep Learning for You?
    3. It’s All about Performance
      1. Flynn’s Taxonomy
      2. Life after Flynn
    4. Organization of This Book
  6. Chapter 2: Deep Learning
    1. Deep Learning
      1. Connectionism
      2. The Perceptron
      3. The First AI Winter
      4. The Experts to the Rescue
      5. The Second AI Winter
      6. The Deeps
      7. The Third AI Winter
      8. Some Supervision Required
    2. A Few Words about CAS
      1. Deployment Models
      2. CAS Sessions
      3. Caslibs
      4. Workers
      5. Action Sets and Actions
      6. Cleanup
    3. All about the Data
      1. The Men Body Mass Index Data Set
      2. The IRIS Data Set
    4. Logistic Regression
      1. Preamble
      2. Create the ANN
      3. Training
      4. Inference
    5. Conclusion
  7. Chapter 3: Regressions
    1. A Brief History of Regressions
    2. All about the Data (Reprise)
      1. The CARS Data Set
    3. A Simple Regression
    4. The Universal Approximation Theorem
      1. Universal Approximation Framework
    5. Approximation of a Continuous Function
    6. Conclusions
  8. Chapter 4: Many-Task Computing
    1. A Taxonomy for Parallel Programs
    2. Tasks Are the New Threads
      1. What Is a Task?
      2. Inputs and Outputs
      3. Immutable Inputs
      4. What Is a Job Flow?
    3. Examples of Job Flows
    4. Mutable Inputs
    5. Task Revisited
    6. Partitioning
    7. Federated Areas
    8. Persistent Area
    9. Caveats and Pitfalls
      1. Not Declaring Your Inputs
      2. Not Treating Your Immutable Inputs as Immutable
      3. Not Declaring Your Outputs
    10. Performance of Grid Scheduling
    11. Data-Object Pooling
    12. Portable Learning
    13. Conclusion
  9. Chapter 5: Monte Carlo Simulations
    1. Monte Carlo or Las Vegas?
    2. Random Walk
    3. Multi-threaded Random Walk
      1. SAS Studio
      2. Live ETL
      3. A Parallel Program
      4. A Parallel Program with Partitions
      5. Many Cores
    4. Conclusion
  10. Chapter 6: GPU
    1. History of GPUs
      1. The Golden Age of the Multicore
      2. The Golden Age of the Graphics Card
      3. The Golden Age of the GPU
    2. The CUDA Programming Model
    3. Hello π
      1. The CUDA Toolkit
      2. Buffon Revisited
    4. Generating Random Walk Data with CUDA
    5. Putting It All Together
    6. Conclusion
  11. Chapter 7: Monte Carlo Simulations with Deep Learning
    1. Generating Data
      1. Training Data
      2. Testing Data
    2. Training the Network
    3. Inference Using the Network
    4. Performance Summary
    5. Other Examples
      1. Pricing of American Options
      2. Pricing of Variable Annuities Contracts
    6. Conclusion
  12. Chapter 8: Deep Learning for Numerical Applications in the Enterprise
    1. Enterprise Applications
    2. A Task
      1. Data
      2. Task Implementation
    3. A Simple Flow
    4. A Training Flow Task
    5. An Inference Flow
    6. Documentation
    7. Heterogeneous Architectures
    8. Collaboration with Federated Areas
    9. Deploying DL with Federated Areas
    10. Conclusions
  13. Chapter 9: Conclusions
    1. Data-Driven Programming
    2. The Quest for Speed
      1. From Tasks to GPUs
      2. Training and Inference
      3. FPGA
      4. Hybrid Architectures
  14. Appendix A: Development Environment Setup
    1. Linux
    2. Windows
  15. References
  16. Index

Product information

  • Title: Deep Learning for Numerical Applications with SAS
  • Author(s): Henry Bequet
  • Release date: July 2018
  • Publisher(s): SAS Institute
  • ISBN: 9781635266771