Putting it all together

Now we have all the pieces. Let's look at how to put it all together:

  1. We first ingest the dataset and then split it into training and cross-validation sets. The dataset comes split into ten parts, ready for ten-fold cross-validation, but we won't do that. Instead, we'll do a single-fold cross-validation by holding out roughly a third of the data for cross-validation (a possible shuffle helper is sketched after this list):
  typ := "bare"  examples, err := ingest(typ)  log.Printf("errs %v", err)  log.Printf("Examples loaded: %d", len(examples))  shuffle(examples)  cvStart := len(examples) - len(examples)/3  cv := examples[cvStart:]  examples = examples[:cvStart]
  2. We then train the classifier and check whether it can predict its own training data well (one way to compute that accuracy is sketched after this list):
  c := New()        // create a fresh classifier
  c.Train(examples) // train it on the training set
  ...
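
The shuffle helper used in step 1 isn't shown in this excerpt. A minimal sketch, assuming the examples are held in a slice of a simple Example struct (the field names here are placeholders, not necessarily the book's exact type), could look like this:

  import "math/rand"

  // Example stands in for whatever example type ingest returns;
  // its fields are assumptions made for this sketch.
  type Example struct {
  	Document []string // tokenized words of one message
  	Class    string   // known label, e.g. "ham" or "spam"
  }

  // shuffle reorders the examples in place so that the training and
  // cross-validation split isn't biased by the order of the files on disk.
  func shuffle(examples []Example) {
  	rand.Shuffle(len(examples), func(i, j int) {
  		examples[i], examples[j] = examples[j], examples[i]
  	})
  }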
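
The elided part of step 2 is the check itself. One way it could be written, assuming the value returned by New has a Predict method that takes a tokenized document and returns a class name (an assumption about the API, not the book's exact code), is a small accuracy helper run over both the training examples and the held-out cv set, reusing the Example placeholder from the previous sketch:

  // predictor is the minimal behaviour this sketch assumes the
  // classifier returned by New provides.
  type predictor interface {
  	Predict(doc []string) string
  }

  // accuracy returns the fraction of examples whose predicted class
  // matches their known class.
  func accuracy(c predictor, examples []Example) float64 {
  	correct := 0
  	for _, ex := range examples {
  		if c.Predict(ex.Document) == ex.Class {
  			correct++
  		}
  	}
  	return float64(correct) / float64(len(examples))
  }

  // Report accuracy on the training set and on the held-out set.
  log.Printf("training accuracy: %.3f", accuracy(c, examples))
  log.Printf("cross-validation accuracy: %.3f", accuracy(c, cv))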
