Hessian barrier algorithms for linearly constrained optimization problems

Immanuel M. Bomze, Panayotis Mertikopoulos, Werner Schachinger, Mathias Staudigl

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

In this paper, we propose an interior-point method for linearly constrained and possibly nonconvex optimization problems. The method, which we call the Hessian barrier algorithm (HBA), combines a forward Euler discretization of Hessian-Riemannian gradient flows with an Armijo backtracking step-size policy. In this way, HBA can be seen as an alternative to mirror descent, and it contains as special cases the affine scaling algorithm, regularized Newton processes, and several other iterative solution methods. Our main result is that, modulo a nondegeneracy condition, the algorithm converges to the problem's critical set; hence, in the convex case, the algorithm converges globally to the problem's minimum set. In the case of linearly constrained quadratic programs (not necessarily convex), we also show that the method's convergence rate is O(1/k^ρ) for some ρ ∈ (0, 1] that depends only on the choice of kernel function (i.e., not on the problem's primitives). These theoretical results are validated by numerical experiments on standard nonconvex test functions and large-scale traffic assignment problems.
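The abstract describes two ingredients: a kernel-induced (Hessian-Riemannian) descent direction and an Armijo backtracking line search that keeps iterates interior. The minimal Python sketch below illustrates these ingredients for the special case of nonnegativity constraints only (no equality constraints), using an assumed power-kernel direction v = -x^θ ∇f(x), where θ = 2 corresponds to the affine scaling direction; the function names, parameters, and defaults here are illustrative, not the paper's implementation.

```python
import numpy as np

def hba_sketch(f, grad, x0, theta=2.0, alpha0=1.0, beta=0.5,
               c=1e-4, tol=1e-8, max_iter=500):
    """Toy Hessian barrier iteration for min f(x) subject to x >= 0.

    x0 must be strictly positive. The descent direction v = -x**theta * grad(x)
    is the Hessian-Riemannian direction induced by a power kernel; Armijo
    backtracking shrinks the step until the sufficient-decrease condition
    holds and the next iterate stays strictly feasible.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        v = -(x ** theta) * g            # kernel-weighted descent direction
        if np.linalg.norm(v) < tol:      # stationarity (in the metric sense)
            break
        a = alpha0
        # Backtrack: keep x + a*v strictly positive and enforce Armijo decrease.
        while np.any(x + a * v <= 0) or f(x + a * v) > f(x) + c * a * (g @ v):
            a *= beta
        x = x + a * v
    return x
```

A quick usage example: minimizing f(x) = Σ (x_i - 1)² from a strictly positive starting point drives the iterates to the unconstrained minimizer at the all-ones vector, with the backtracking step guaranteeing positivity along the way.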

Original language: English
Pages (from-to): 2100-2127
Number of pages: 28
Journal: SIAM Journal on Optimization
Volume: 29
Issue number: 3
DOIs: 10.1137/18M1215682
Publication status: Published - 2019

Keywords

  • Hessian-Riemannian gradient descent
  • interior-point methods
  • mirror descent
  • nonconvex optimization
  • traffic assignment
  • DYNAMICAL-SYSTEMS
  • GRADIENT
  • CONVEX
  • CONVERGENCE
  • DESCENT
  • FLOWS

Cite this

Bomze, Immanuel M.; Mertikopoulos, Panayotis; Schachinger, Werner; Staudigl, Mathias. / Hessian barrier algorithms for linearly constrained optimization problems. In: SIAM Journal on Optimization. 2019; Vol. 29, No. 3, pp. 2100-2127.
@article{c9854bf21c1a43b9b377945f2975b10f,
title = "Hessian barrier algorithms for linearly constrained optimization problems",
abstract = "In this paper, we propose an interior-point method for linearly constrained and possibly nonconvex optimization problems. The method, which we call the Hessian barrier algorithm (HBA), combines a forward Euler discretization of Hessian-Riemannian gradient flows with an Armijo backtracking step-size policy. In this way, HBA can be seen as an alternative to mirror descent, and it contains as special cases the affine scaling algorithm, regularized Newton processes, and several other iterative solution methods. Our main result is that, modulo a nondegeneracy condition, the algorithm converges to the problem's critical set; hence, in the convex case, the algorithm converges globally to the problem's minimum set. In the case of linearly constrained quadratic programs (not necessarily convex), we also show that the method's convergence rate is $O(1/k^\rho)$ for some $\rho \in (0, 1]$ that depends only on the choice of kernel function (i.e., not on the problem's primitives). These theoretical results are validated by numerical experiments on standard nonconvex test functions and large-scale traffic assignment problems.",
keywords = "Hessian-Riemannian gradient descent, interior-point methods, mirror descent, nonconvex optimization, traffic assignment, DYNAMICAL-SYSTEMS, GRADIENT, CONVEX, CONVERGENCE, DESCENT, FLOWS",
author = "Bomze, {Immanuel M.} and Panayotis Mertikopoulos and Werner Schachinger and Mathias Staudigl",
year = "2019",
doi = "10.1137/18M1215682",
language = "English",
volume = "29",
pages = "2100--2127",
journal = "SIAM Journal on Optimization",
issn = "1052-6234",
publisher = "Society for Industrial and Applied Mathematics Publications",
number = "3",

}

TY - JOUR

T1 - Hessian barrier algorithms for linearly constrained optimization problems

AU - Bomze, Immanuel M.

AU - Mertikopoulos, Panayotis

AU - Schachinger, Werner

AU - Staudigl, Mathias

PY - 2019

Y1 - 2019

N2 - In this paper, we propose an interior-point method for linearly constrained and possibly nonconvex optimization problems. The method, which we call the Hessian barrier algorithm (HBA), combines a forward Euler discretization of Hessian-Riemannian gradient flows with an Armijo backtracking step-size policy. In this way, HBA can be seen as an alternative to mirror descent, and it contains as special cases the affine scaling algorithm, regularized Newton processes, and several other iterative solution methods. Our main result is that, modulo a nondegeneracy condition, the algorithm converges to the problem's critical set; hence, in the convex case, the algorithm converges globally to the problem's minimum set. In the case of linearly constrained quadratic programs (not necessarily convex), we also show that the method's convergence rate is O(1/k^ρ) for some ρ ∈ (0, 1] that depends only on the choice of kernel function (i.e., not on the problem's primitives). These theoretical results are validated by numerical experiments on standard nonconvex test functions and large-scale traffic assignment problems.

AB - In this paper, we propose an interior-point method for linearly constrained and possibly nonconvex optimization problems. The method, which we call the Hessian barrier algorithm (HBA), combines a forward Euler discretization of Hessian-Riemannian gradient flows with an Armijo backtracking step-size policy. In this way, HBA can be seen as an alternative to mirror descent, and it contains as special cases the affine scaling algorithm, regularized Newton processes, and several other iterative solution methods. Our main result is that, modulo a nondegeneracy condition, the algorithm converges to the problem's critical set; hence, in the convex case, the algorithm converges globally to the problem's minimum set. In the case of linearly constrained quadratic programs (not necessarily convex), we also show that the method's convergence rate is O(1/k^ρ) for some ρ ∈ (0, 1] that depends only on the choice of kernel function (i.e., not on the problem's primitives). These theoretical results are validated by numerical experiments on standard nonconvex test functions and large-scale traffic assignment problems.

KW - Hessian-Riemannian gradient descent

KW - interior-point methods

KW - mirror descent

KW - nonconvex optimization

KW - traffic assignment

KW - DYNAMICAL-SYSTEMS

KW - GRADIENT

KW - CONVEX

KW - CONVERGENCE

KW - DESCENT

KW - FLOWS

U2 - 10.1137/18M1215682

DO - 10.1137/18M1215682

M3 - Article

VL - 29

SP - 2100

EP - 2127

JO - SIAM Journal on Optimization

JF - SIAM Journal on Optimization

SN - 1052-6234

IS - 3

ER -