Coordinate Descent Method for Large-scale L2-loss Linear SVM

Kai-Wei Chang, Cho-Jui Hsieh, and Chih-Jen Lin, in JMLR, 2008.


[Full Text]

Abstract

Linear support vector machines (SVMs) are useful for classifying large-scale sparse data. Problems with sparse features are common in applications such as document classification and natural language processing. In this paper, we propose a novel coordinate descent algorithm for training linear SVMs with the L2-loss function. At each step, the proposed method minimizes a one-variable sub-problem while fixing all other variables. The sub-problem is solved by Newton steps with a line search technique. The procedure converges globally at a linear rate. Because each sub-problem involves only the values of a single feature, the proposed approach is suitable when accessing a feature is more convenient than accessing an instance. Experiments show that our method is more efficient and stable than state-of-the-art methods such as Pegasos and TRON.
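
The abstract outlines the algorithmic idea: cycle through the weights and, for each one, apply a Newton step with a backtracking line search to the one-variable sub-problem of the primal L2-loss objective, min_w 0.5*||w||^2 + C*sum_i max(0, 1 - y_i w'x_i)^2. The NumPy sketch below only illustrates that idea under simple assumptions (dense data; illustrative defaults for C, the backtracking factor beta, the sufficient-decrease constant sigma, and the stopping tolerance); it is not the authors' implementation, which exploits per-feature sparse access and other refinements.

import numpy as np

def cd_l2svm(X, y, C=1.0, max_sweeps=50, beta=0.5, sigma=0.01, tol=1e-4):
    """Coordinate descent sketch for the primal L2-loss linear SVM:
        min_w 0.5*||w||^2 + C * sum_i max(0, 1 - y_i * w'x_i)^2
    X: (n, d) array; y: (n,) array of +1/-1 labels."""
    n, d = X.shape
    w = np.zeros(d)
    b = 1.0 - y * (X @ w)                       # maintained margins b_i = 1 - y_i w'x_i

    def obj(bvec, wvec):
        return 0.5 * wvec @ wvec + C * np.sum(np.maximum(bvec, 0.0) ** 2)

    for sweep in range(max_sweeps):
        max_step = 0.0
        for j in range(d):
            col = X[:, j]
            active = b > 0                      # instances with positive loss
            # First and generalized second derivative of the one-variable
            # sub-problem D_j(z) = f(w + z e_j) at z = 0
            g = w[j] - 2.0 * C * np.sum(y[active] * col[active] * b[active])
            h = 1.0 + 2.0 * C * np.sum(col[active] ** 2)
            if abs(g) < 1e-12:
                continue
            z = -g / h                          # full Newton direction
            # Backtracking line search with a sufficient-decrease condition
            f0 = obj(b, w)
            while True:
                w_trial = w.copy()
                w_trial[j] += z
                b_trial = b - z * y * col       # b_i(w + z e_j) = b_i - z*y_i*x_ij
                if obj(b_trial, w_trial) - f0 <= -sigma * z * z:
                    break
                z *= beta
            w = w_trial
            b = b_trial
            max_step = max(max_step, abs(z))
        if max_step < tol:                      # stop when no coordinate moves much
            break
    return w

A toy usage example with synthetic data (also an assumption, only to show the call):

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 20))
    y = np.sign(X @ rng.standard_normal(20))
    w = cd_l2svm(X, y, C=1.0)
    print("training accuracy:", np.mean(np.sign(X @ w) == y))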

Bib Entry

@article{ChangHsLi08,
  author = {Chang, Kai-Wei and Hsieh, Cho-Jui and Lin, Chih-Jen},
  title = {Coordinate Descent Method for Large-scale L2-loss Linear SVM},
  journal = {Journal of Machine Learning Research},
  volume = {9},
  year = {2008}
}
