Chapter 19. Large-Scale Machine Learning
Jerod J. Weinman, Augustus Lidaka and Shitanshu Aggarwal
A typical machine-learning algorithm creates a classification function that inductively generalizes from training examples — input features and associated classification labels — to previously unseen examples requiring labels. Optimizing the prediction accuracy of the learned function for complex problems can require massive amounts of training data. This chapter describes a GPU-based implementation of a discriminative maximum entropy learning algorithm that can improve runtime on large datasets by a factor of over 200.
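For orientation, the standard conditional form of a maximum entropy classifier is a useful reference point; the weight vector \theta and feature function f below are generic placeholders introduced for illustration, not the chapter's own notation:

P(y \mid x; \theta) = \frac{\exp\left(\theta^{\top} f(x, y)\right)}{\sum_{y'} \exp\left(\theta^{\top} f(x, y')\right)}

Training fits \theta by maximizing the conditional log-likelihood \sum_{i=1}^{N} \log P(y_i \mid x_i; \theta) over the N labeled examples. The per-example sums in this objective and its gradient grow with the size of the training set, which is typically what makes learning on large datasets costly and a parallel GPU implementation attractive.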
19.1. Introduction
Machine learning is applied to a variety of problems, including time series prediction for financial forecasting ...
