PCA helps us identify patterns in data based on the correlation between features. In a nutshell, PCA aims to find the directions of maximum variance in high-dimensional data and project it onto a new subspace with equal or fewer dimensions than the original one. The orthogonal axes (principal components) of the new subspace can be interpreted as the directions of maximum variance, given the constraint that the new feature axes are orthogonal to each other, as illustrated in the following figure:
- 5. Compressing Data via Dimensionality Reduction
- from Python Machine Learning: Machine Learning and Deep Learning with Python, scikit-learn, and TensorFlow
- Publisher: Packt Publishing
- Released: September 2017
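The idea in the excerpt can be sketched numerically: center the data, compute the covariance matrix of the features, take its eigenvectors as the orthogonal principal axes, and project onto the top components. This is a minimal NumPy sketch of that standard procedure, not the book's own listing; the synthetic data and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic correlated 2-D data (illustrative assumption)
X = rng.normal(size=(200, 2)) @ np.array([[2.0, 0.0], [1.0, 0.5]])

# 1. center each feature at zero mean
Xc = X - X.mean(axis=0)

# 2. covariance matrix of the features
cov = np.cov(Xc, rowvar=False)

# 3. eigendecomposition; eigh is appropriate since cov is symmetric
eigvals, eigvecs = np.linalg.eigh(cov)

# sort components by descending explained variance
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# 4. project the data onto the top-k principal components
k = 1
X_proj = Xc @ eigvecs[:, :k]
```

The eigenvectors form an orthogonal basis, and the variance of the data projected onto the first component equals the largest eigenvalue, which is exactly the "direction of maximum variance" the excerpt describes.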
What does this mean?