A Dual Coordinate Descent Method for Large-Scale Linear SVM

Cho-Jui Hsieh, Kai-Wei Chang, Chih-Jen Lin, S. Sathiya Keerthi, and S. Sundararajan, in ICML, 2008.

Top-10 cited paper at ICML 08

Slides · Code · Download the full text


Abstract

In many applications, data appear with a huge number of instances as well as features. The linear Support Vector Machine (SVM) is one of the most popular tools for dealing with such large-scale sparse data. This paper presents a novel dual coordinate descent method for linear SVM with L1- and L2-loss functions. The proposed method is simple and reaches an ε-accurate solution in O(log(1/ε)) iterations. Experiments indicate that our method is much faster than state-of-the-art solvers such as Pegasos, TRON, SVMperf, and a recent primal coordinate descent implementation.
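To illustrate the idea behind the paper, here is a minimal sketch (not the authors' implementation) of dual coordinate descent for the L1-loss (hinge) linear SVM: the dual problem min_α ½ α^T Q α − e^T α subject to 0 ≤ α_i ≤ C is solved one coordinate at a time, with a closed-form 1-D update, while maintaining w = Σ_i α_i y_i x_i so each update costs O(d). The data, function name, and hyperparameters below are hypothetical.

```python
import numpy as np

def dual_cd_svm(X, y, C=1.0, epochs=50, seed=0):
    """Sketch of dual coordinate descent for an L1-loss linear SVM.

    Minimizes 0.5 * a^T Q a - e^T a with 0 <= a_i <= C, where
    Q_ij = y_i y_j x_i . x_j, keeping w = sum_i a_i y_i x_i up to date.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    Qii = (X * X).sum(axis=1)  # diagonal of Q (y_i^2 = 1)
    for _ in range(epochs):
        for i in rng.permutation(n):  # randomized sweep order
            if Qii[i] == 0.0:
                continue
            # gradient of the dual objective along coordinate i
            G = y[i] * (w @ X[i]) - 1.0
            # closed-form 1-D minimizer, clipped to the box [0, C]
            a_new = min(max(alpha[i] - G / Qii[i], 0.0), C)
            if a_new != alpha[i]:
                w += (a_new - alpha[i]) * y[i] * X[i]  # O(d) update of w
                alpha[i] = a_new
    return w

# Tiny linearly separable example (made-up data, not from the paper).
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = dual_cd_svm(X, y, C=1.0)
pred = np.sign(X @ w)
```

Because each coordinate update touches only one instance's features, the per-step cost is proportional to the number of nonzeros in x_i, which is what makes this approach attractive for large sparse data.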


Bib Entry

@inproceedings{HCLKS08,
  author = {Hsieh, Cho-Jui and Chang, Kai-Wei and Lin, Chih-Jen and Keerthi, S. Sathiya and Sundararajan, S.},
  title = {A Dual Coordinate Descent Method for Large-Scale Linear SVM},
  booktitle = {ICML},
  year = {2008}
}

Related Publications

  1. Large Linear Classification When Data Cannot Fit In Memory, TKDD, 2012
  2. Selective Block Minimization for Faster Convergence of Limited Memory Large-scale Linear Models, KDD, 2011
  3. Iterative Scaling and Coordinate Descent Methods for Maximum Entropy Models, JMLR, 2010
  4. A Comparison of Optimization Methods and Software for Large-scale L1-regularized Linear Classification, JMLR, 2010
  5. Training and Testing Low-degree Polynomial Data Mappings via Linear SVM, JMLR, 2010
  6. A Sequential Dual Method for Large Scale Multi-Class Linear SVMs, KDD, 2008
  7. Coordinate Descent Method for Large-scale L2-loss Linear SVM, JMLR, 2008
  8. LIBLINEAR: A Library for Large Linear Classification, JMLR, 2008