An example of PCA with Scikit-Learn

We can repeat the experiment performed with FA on the dataset with heteroscedastic noise to assess the MLE score of PCA. We are going to use the PCA class with the same number of components (n_components=64). To achieve maximum accuracy, we also set the svd_solver='full' parameter, forcing Scikit-Learn to apply a full SVD instead of the truncated version. In this way, the top eigenvalues are selected only after the decomposition, avoiding the risk of imprecise estimations:

from sklearn.decomposition import PCA

pca = PCA(n_components=64, svd_solver='full', random_state=1000)
Xpca = pca.fit_transform(Xh)

print(pca.score(Xh))
-3772.7483580391995

The result is not surprising: the MLE score is much lower than the one achieved with FA, because ...
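
For reference, the comparison mentioned above can be reproduced with a sketch like the following, assuming Xh is the same dataset with heteroscedastic noise used in the previous section (the FactorAnalysis parameters shown here are assumptions, not taken from the original experiment):

from sklearn.decomposition import FactorAnalysis

# Sketch with assumed parameters: fit FA on the same noisy dataset Xh so that
# its average log-likelihood can be compared with the PCA score shown above
fa = FactorAnalysis(n_components=64, random_state=1000)
Xfa = fa.fit_transform(Xh)

# FactorAnalysis.score() returns the average log-likelihood of the samples;
# with heteroscedastic noise it is expected to be higher than the PCA score
print(fa.score(Xh))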
