  • Victor Bos thinks this is interesting:

Accuracy is then defined as the sum of the number of true positives and true negatives divided by the total number of examples

From

Machine Learning, 2nd Edition

Note

(2.2) defines the accuracy as the sum of the number of true positives and false positives divided by the total number of data points.

The body text, by contrast, defines accuracy as the number of correctly classified data points divided by the total number of data points. The text's definition is probably the correct one, and (2.2) is in error: the second term should be true negatives, not false positives.
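To make the difference concrete, here is a minimal sketch (the function names and the toy labels are my own, not from the book) contrasting the text's definition, (TP + TN) / total, with the erroneous (TP + FP) / total from (2.2):

```python
def accuracy(y_true, y_pred):
    """Accuracy per the text: correctly classified points / total points."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return (tp + tn) / len(y_true)

def accuracy_as_in_2_2(y_true, y_pred):
    """The (erroneous) formula printed in (2.2): (TP + FP) / total."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return (tp + fp) / len(y_true)

y_true = [1, 0, 1, 1, 0, 0]
y_pred = [1, 0, 0, 1, 1, 0]
# TP = 2, TN = 2, FP = 1, FN = 1; 4 of 6 points are classified correctly.
print(accuracy(y_true, y_pred))          # 4/6 — matches the text
print(accuracy_as_in_2_2(y_true, y_pred))  # 3/6 — (2.2) disagrees
```

On this toy data the two formulas give different values, which confirms they cannot both be the definition of accuracy; only the first matches "correctly classified points divided by total points."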