We develop a new stochastic algorithm with variance reduction for solving pseudo-monotone stochastic variational inequalities. Our method builds on Tseng’s forward-backward-forward algorithm, which is known in the deterministic literature to be a valuable alternative to Korpelevich’s extragradient method for solving variational inequalities over a closed convex set governed by pseudo-monotone and Lipschitz continuous operators. The main computational advantage of Tseng’s algorithm is that it relies on only a single projection step and two independent queries of a stochastic oracle per iteration. Our algorithm incorporates a variance-reduction mechanism and achieves almost-sure convergence to solutions of a merely pseudo-monotone stochastic variational inequality problem. To the best of our knowledge, this is the first stochastic algorithm to achieve this using only a single projection per iteration.
- Variational inequalities
- Forward-backward-forward algorithm
- Stochastic approximation
- Variance reduction
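The iteration described in the abstract (one projection, two independent stochastic oracle queries, with growing mini-batches as the variance-reduction device) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the test operator `M`, the box constraint set, the noise level, the step size, and the batch-growth schedule are all assumed choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test problem: VI(F, C) with F(x) = M x (monotone and
# Lipschitz), whose unique solution x* = 0 lies in the box C = [-2, 2]^2.
M = np.array([[0.1, 1.0], [-1.0, 0.1]])  # rotation plus mild strong monotonicity

def project(x):
    """Euclidean projection onto the box C = [-2, 2]^2."""
    return np.clip(x, -2.0, 2.0)

def oracle(x, batch):
    """Stochastic oracle: mini-batch average of noisy evaluations of F(x)."""
    noise = rng.normal(scale=0.5, size=(batch, 2))
    return M @ x + noise.mean(axis=0)

def stochastic_fbf(x0, gamma=0.5, iters=300):
    """Tseng-style forward-backward-forward iteration: one projection and
    two independent oracle queries per step; the growing mini-batch size
    plays the role of the variance-reduction mechanism (illustrative)."""
    x = x0
    for t in range(iters):
        batch = 5 * (t + 1)          # growing batch => shrinking oracle variance
        v = oracle(x, batch)         # first oracle query, at x
        y = project(x - gamma * v)   # forward-backward step: the single projection
        w = oracle(y, batch)         # second, independent oracle query, at y
        x = y - gamma * (w - v)      # forward correction step, no projection
    return x

x_final = stochastic_fbf(np.array([1.5, -1.0]))
```

Note the design point the abstract emphasizes: the correction step `y - gamma * (w - v)` replaces the second projection that the extragradient method would require, so each iteration touches the constraint set only once.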
Staudigl, M., Boț, R. I., & Vuong, P. T. (2019). arXiv preprint, 34 p.
Research output: Working paper › Professional
Staudigl, M., & Mertikopoulos, P. (2019). Convergent noisy forward-backward-forward algorithms in non-monotone variational inequalities. IFAC-PapersOnLine, 52(3), 120-125. https://doi.org/10.1016/j.ifacol.2019.06.021