Self-Concordant Analysis of Frank-Wolfe Algorithms

P. Dvurechensky, P. Ostroukhov, K. Safin, S. Shtern, M. Staudigl*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceeding › Academic › peer-review

Abstract

Projection-free optimization via variants of the Frank-Wolfe (FW) method, also known as the Conditional Gradient method, has become a cornerstone of optimization for machine learning, since in many cases the linear minimization oracle is much cheaper to implement than projections and sparsity of the iterates needs to be preserved. In a number of applications, e.g. Poisson inverse problems or quantum state tomography, the loss is given by a self-concordant (SC) function with unbounded curvature, implying an absence of theoretical guarantees for the existing FW methods. We use the theory of SC functions to provide a new adaptive step size for FW methods and prove a global convergence rate of O(1/k) after k iterations. If the problem admits a stronger local linear minimization oracle, we construct a novel FW method with a linear convergence rate for SC functions.
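To make the idea concrete, here is a minimal sketch of a Frank-Wolfe iteration with an adaptive step size driven by the local Hessian norm, assuming a Poisson-style self-concordant loss over the unit simplex. The damped step rule gap / (e · (gap + e)), with e a scaled local norm of the FW direction, is an illustrative choice in the spirit of SC analysis, not claimed to be the paper's exact rule; all names (sc_frank_wolfe, lmo_simplex, A, y, M) are hypothetical.

```python
import numpy as np

# Illustrative SC objective: Poisson-style loss f(x) = sum_i (a_i^T x - y_i log(a_i^T x)),
# minimized over the unit simplex (A > 0 keeps z = A @ x positive on the simplex).

def grad(x, A, y):
    z = A @ x
    return A.T @ (1.0 - y / z)

def hess_quad(x, d, A, y):
    # d^T H(x) d for H(x) = A^T diag(y / z^2) A
    z = A @ x
    return np.sum(y * (A @ d) ** 2 / z ** 2)

def lmo_simplex(g):
    # Linear minimization oracle over the unit simplex: the best vertex.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

def sc_frank_wolfe(A, y, iters=200, M=2.0, tol=1e-10):
    n = A.shape[1]
    x = np.full(n, 1.0 / n)
    for _ in range(iters):
        g = grad(x, A, y)
        s = lmo_simplex(g)
        d = s - x
        gap = -g @ d                            # Frank-Wolfe gap, always >= 0
        if gap <= tol:
            break
        e = M * np.sqrt(hess_quad(x, d, A, y))  # scaled local norm of the FW direction
        # Damped step in the spirit of SC analysis (illustrative rule).
        step = 1.0 if e == 0 else min(1.0, gap / (e * (gap + e)))
        x = x + step * d
    return x

# Example usage with random positive data (illustrative only):
rng = np.random.default_rng(0)
A = rng.uniform(0.5, 1.5, size=(30, 10))
y = rng.poisson(lam=2.0, size=30).astype(float)
x_hat = sc_frank_wolfe(A, y)
```

The step stays in [0, 1], so the iterates remain feasible and the loss stays in the domain, which is the role the SC-based damping plays in place of a curvature-constant step size.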
Original language: English
Title of host publication: Proceedings of the 37th International Conference on Machine Learning
Subtitle of host publication: International Conference on Machine Learning, 13-18 July 2020, Virtual
Editors: Hal Daumé III, Aarti Singh
Publisher: Proceedings of Machine Learning Research
Pages: 2814-2824
Number of pages: 11
Volume: 119
Publication status: Published - 2020
Event: 37th International Conference on Machine Learning (ICML 2020) - Virtual
Duration: 13 Jul 2020 - 18 Jul 2020

Conference

Conference: 37th International Conference on Machine Learning (ICML 2020)
Location: Virtual
Period: 13/07/20 - 18/07/20

Keywords

  • OPTIMIZATION
  • CONVERGENCE
  • COMPLEXITY
