How it works...

We used the UCI admissions dataset with LogisticRegressionWithLBFGS() to predict whether a student will be admitted. The intercept was set to false, and the .run() and .predict() APIs were used to fit the model and generate predictions. The key point here is that L-BFGS scales well to an extremely large number of parameters, particularly when the data is sparse. Regardless of which optimization technique is used, we emphasize again that ridge regression shrinks the parameter weights but never sets them exactly to zero.
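The shrinkage behavior of ridge regression can be seen in a toy, stand-alone sketch (not Spark code; the names and constants are hypothetical): one gradient step under an L2 penalty only rescales a weight, so it approaches zero without ever reaching it, while the L1 soft-thresholding step can zero it out exactly.

```scala
// Hypothetical illustration: a single weight update under L2 (ridge)
// vs. L1 (lasso) regularization, with step size and lambda made up.
object ShrinkageDemo {
  // Ridge: the penalty gradient is lambda * w, so the update merely
  // rescales the weight -- it shrinks but never reaches exactly zero.
  def ridgeStep(w: Double, step: Double, lambda: Double): Double =
    w * (1.0 - step * lambda)

  // Lasso: soft-thresholding can set the weight exactly to zero.
  def lassoStep(w: Double, step: Double, lambda: Double): Double =
    math.signum(w) * math.max(0.0, math.abs(w) - step * lambda)

  def main(args: Array[String]): Unit = {
    var wRidge = 0.5
    var wLasso = 0.5
    for (_ <- 1 to 100) {
      wRidge = ridgeStep(wRidge, 0.1, 1.0)  // stays small but nonzero
      wLasso = lassoStep(wLasso, 0.1, 1.0)  // hits exactly 0.0
    }
    println(s"ridge weight: $wRidge")
    println(s"lasso weight: $wLasso")
  }
}
```

Running the loop shows the ridge weight decaying geometrically toward (but never to) zero, while the lasso weight is exactly zero after a few steps.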

The signature of the constructor for this method is as follows:

LogisticRegressionWithLBFGS()
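A minimal sketch of the workflow described above, assuming Spark is on the classpath and the admission data has already been parsed into LabeledPoints (the sample feature values and column layout here are hypothetical, not the actual UCI file):

```scala
import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.sql.SparkSession

object LbfgsSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.master("local[*]").appName("lbfgs").getOrCreate()
    val sc = spark.sparkContext

    // Hypothetical parsed rows: label (admit 0/1), then GRE, GPA, rank.
    val training = sc.parallelize(Seq(
      LabeledPoint(0.0, Vectors.dense(380.0, 3.61, 3.0)),
      LabeledPoint(1.0, Vectors.dense(660.0, 3.67, 3.0))
    ))

    val model = new LogisticRegressionWithLBFGS()
      .setIntercept(false)   // intercept disabled, as in the recipe
      .run(training)         // fit with the L-BFGS optimizer

    // Predict for a new applicant's feature vector.
    val prediction = model.predict(Vectors.dense(800.0, 4.0, 1.0))
    println(s"prediction: $prediction")
    spark.stop()
  }
}
```

With the default 0.5 threshold, predict() returns a class label of 0.0 or 1.0 rather than a probability.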

L-BFGS optimization in Spark is a limited-memory variant of the BFGS quasi-Newton method. Like Newton's method, it exploits curvature (second-derivative) information in addition to the gradient, but instead of computing the full Hessian it builds an approximation of the inverse Hessian from a short history of recent gradient differences, which is what makes it practical for models with very many parameters.
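The quasi-Newton idea can be illustrated in one dimension, where estimating curvature from successive gradients reduces to a secant step (this is a toy sketch, not Spark's implementation; the objective and names are made up):

```scala
// Toy 1-D illustration of the quasi-Newton idea behind L-BFGS:
// estimate curvature from gradient differences (a secant step)
// instead of computing the second derivative exactly.
object QuasiNewtonDemo {
  // Minimize f(w) = (w - 3)^2 using only its gradient g(w) = 2(w - 3).
  def grad(w: Double): Double = 2.0 * (w - 3.0)

  def minimize(w0: Double, w1: Double, maxIters: Int, tol: Double = 1e-10): Double = {
    var prev = w0
    var cur = w1
    var i = 0
    while (i < maxIters && math.abs(grad(cur)) > tol) {
      // Curvature estimate from the change in gradient over the change
      // in position -- the 1-D analogue of BFGS's y/s update.
      val curvature = (grad(cur) - grad(prev)) / (cur - prev)
      val next = cur - grad(cur) / curvature
      prev = cur
      cur = next
      i += 1
    }
    cur
  }

  def main(args: Array[String]): Unit =
    println(s"minimizer: ${minimize(0.0, 1.0, 50)}")
}
```

Because the objective is quadratic, the secant curvature estimate is exact and the method converges to w = 3 in a single step; on general objectives the estimate improves as iterates accumulate.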
