Fang-Lan Huang, Cho-Jui Hsieh, Kai-Wei Chang, Chih-Jen Lin

JMLR

### Abstract

Maximum entropy (Maxent) models are useful in natural language processing and many other areas. Iterative scaling (IS) methods are among the most popular approaches for solving Maxent. Because IS methods come in many variants, it is difficult to understand them and see their differences. In this paper, we create a general and unified framework for iterative scaling methods. This framework also connects iterative scaling and coordinate descent methods. We prove general convergence results for IS methods and analyze their computational complexity. Based on the proposed framework, we extend a coordinate descent method for linear SVM to Maxent. Results show that the new method is faster than existing iterative scaling methods.
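To give a rough feel for what a coordinate-wise update for a regularized Maxent model can look like, here is a minimal sketch in Python. It is not the algorithm from the paper: the multiclass-logistic parameterization, the one-dimensional Newton step without a line search, and all function and variable names are illustrative assumptions.

```python
# Illustrative sketch only: cyclic coordinate descent with one-dimensional
# Newton steps for an L2-regularized Maxent (multiclass logistic) model.
# This is NOT the paper's algorithm; it is a toy example under assumed
# parameterization and data.  A practical solver would add a line search
# or step-size safeguard to the Newton step.
import numpy as np

def softmax(scores):
    """Row-wise softmax, shifted for numerical stability."""
    scores = scores - scores.max(axis=1, keepdims=True)
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)

def maxent_coordinate_descent(X, y, n_classes, sigma2=1.0, sweeps=20):
    """Update one weight W[j, c] at a time with a Newton step.

    X      : (n_samples, n_features) feature matrix
    y      : (n_samples,) integer class labels
    sigma2 : Gaussian-prior variance (regularization strength 1/sigma2)
    """
    n, d = X.shape
    W = np.zeros((d, n_classes))
    Y = np.eye(n_classes)[y]                  # one-hot labels
    for _ in range(sweeps):
        P = softmax(X @ W)                    # current model probabilities
        for j in range(d):
            for c in range(n_classes):
                # First and second derivatives of the regularized negative
                # log-likelihood with respect to the single weight W[j, c].
                g = X[:, j] @ (P[:, c] - Y[:, c]) + W[j, c] / sigma2
                h = (X[:, j] ** 2) @ (P[:, c] * (1 - P[:, c])) + 1.0 / sigma2
                W[j, c] -= g / h              # one-dimensional Newton step
                P = softmax(X @ W)            # refresh probabilities
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = (X @ rng.normal(size=(5, 3))).argmax(axis=1)   # synthetic labels
    W = maxent_coordinate_descent(X, y, n_classes=3)
    print("training accuracy:", ((X @ W).argmax(axis=1) == y).mean())
```

Recomputing the probabilities after every single-coordinate update keeps the sketch simple but costly; efficient implementations instead maintain the per-example scores incrementally.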
### Bib entry

> @article{HHCL10,
> author = {Fang-Lan Huang and Cho-Jui Hsieh and Kai-Wei Chang and Chih-Jen Lin},
> title = {{Iterative Scaling and Coordinate Descent Methods for Maximum Entropy Models}},
> journal = {Journal of Machine Learning Research},
> year = {2010}
> }

### Notes

A short version appears as a short paper at ACL 2009.