FaRSA for minimizing convex l_1-regularized functions

Tianyi Chen and Daniel P. Robinson, Applied Mathematics and Statistics, Johns Hopkins University

Poster

We present our work on minimizing objective functions that can be written as the sum of a convex function and a sparsity-inducing l_1 regularizer. By using curvature information from subspaces that evolve during the solution process, we have designed an algorithm that is typically more robust and more efficient than state-of-the-art methods. Moreover, our method guarantees convergence of the iterate sequence from an arbitrary starting point as well as a local superlinear rate of convergence. Numerical results are presented for l_1-regularized logistic regression on a collection of commonly used data sets.
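For concreteness, the problem class can be sketched in standard composite form; the symbols below (f, lambda, and the labeled data (d_i, y_i) in the logistic-regression instance) are our own notation for illustration, not taken from the abstract:

\begin{align*}
  \min_{x \in \mathbb{R}^n} \; F(x) &:= f(x) + \lambda \|x\|_1,
    \qquad f \text{ convex},\ \lambda > 0, \\
  \text{e.g.}\quad f(x) &= \frac{1}{m} \sum_{i=1}^{m} \log\!\bigl(1 + e^{-y_i d_i^{T} x}\bigr)
    \qquad \text{(logistic regression with data } d_i \in \mathbb{R}^n,\ y_i \in \{-1,+1\}\text{)}.
\end{align*}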