On the convergence of stochastic forward-backward-forward algorithms with variance reduction in pseudo-monotone variational inequalities

Mathias Staudigl, Radu Ioan Bot, Panayotis Mertikopoulos, Phan Tu Vuong

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceeding › Academic › peer-review

Abstract

We develop a new stochastic algorithm with variance reduction for solving pseudo-monotone stochastic variational inequalities. Our method builds on Tseng's forward-backward-forward algorithm, which is known in the deterministic literature to be a valuable alternative to Korpelevich's extragradient method when solving variational inequalities over a closed convex set governed by pseudo-monotone and Lipschitz continuous operators. The main computational advantage of Tseng's algorithm is that it relies on only a single projection step and two independent queries of a stochastic oracle. Our algorithm incorporates a variance reduction mechanism and leads to a.s. convergence to solutions of a merely pseudo-monotone stochastic variational inequality problem. To the best of our knowledge, this is the first stochastic algorithm achieving this by using only a single projection at each iteration.
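The abstract describes the method's structure (one projection and two independent stochastic-oracle queries per iteration, plus variance reduction) without stating the update rule. As a hedged illustration only, a Tseng-type stochastic forward-backward-forward step with a mini-batch oracle might be sketched as follows; the toy skew operator, box constraint, step size, noise level, and growing-batch schedule are all illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

def project_box(x, lo=-1.0, hi=1.0):
    # Euclidean projection onto the box [lo, hi]^d (the single projection per step)
    return np.clip(x, lo, hi)

def oracle(F, x, batch, rng, sigma=1.0):
    # Mini-batch stochastic oracle: averages `batch` noisy evaluations of F,
    # cutting the oracle variance by a factor 1/batch (the variance-reduction idea).
    noise = rng.normal(0.0, sigma, size=(batch, x.size)).mean(axis=0)
    return F(x) + noise

def stochastic_fbf(F, x0, steps=2000, gamma=0.1, rng=None):
    # Tseng-type forward-backward-forward iteration with a stochastic oracle:
    # one projection and two independent oracle queries per iteration.
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for t in range(1, steps + 1):
        batch = t                              # growing mini-batch (assumed schedule)
        g1 = oracle(F, x, batch, rng)
        y = project_box(x - gamma * g1)        # forward-backward step
        g2 = oracle(F, y, batch, rng)
        x = y - gamma * (g2 - g1)              # correcting (second forward) step
    return x

# Toy monotone operator F(x) = (x2, -x1); the VI over the box has solution 0.
F = lambda x: np.array([x[1], -x[0]])
sol = stochastic_fbf(F, np.array([0.9, -0.8]))
```

The toy operator is skew (monotone but not strongly monotone), a regime where plain projected stochastic gradient steps need not converge but extragradient-type schemes such as FBF do, provided the step size is below the reciprocal Lipschitz constant.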
Original language: English
Title of host publication: Advances in Neural Information Processing Systems (NIPS)
Publisher: Neural Information Processing Systems Foundation
Number of pages: 5
Publication status: Published - 2018

Cite this

Staudigl, M., Bot, R. I., Mertikopoulos, P., & Vuong, P. T. (2018). On the convergence of stochastic forward-backward-forward algorithms with variance reduction in pseudo-monotone variational inequalities. In Advances in Neural Information Processing Systems (NIPS). Neural Information Processing Systems Foundation. http://polaris.imag.fr/panayotis.mertikopoulos/files/FBF-NIPS.pdf