Adrien Taylor
Inria - ENS Paris
Verified email at inria.fr
Title · Cited by · Year
Smooth strongly convex interpolation and exact worst-case performance of first-order methods
AB Taylor, JM Hendrickx, F Glineur
Mathematical Programming 161 (1-2), 307-345, 2017
Cited by 119, 2017
Exact worst-case performance of first-order methods for composite convex optimization
AB Taylor, JM Hendrickx, F Glineur
SIAM Journal on Optimization 27 (3), 1283-1313, 2017
Cited by 77*, 2017
On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
E De Klerk, F Glineur, AB Taylor
Optimization Letters 11 (7), 1185-1199, 2017
Cited by 45, 2017
Stochastic first-order methods: non-asymptotic and computer-aided analyses via potential functions
A Taylor, F Bach
Proceedings of the Thirty-Second Conference on Learning Theory (COLT), 2019
Cited by 44, 2019
Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
AB Taylor, JM Hendrickx, F Glineur
Journal of Optimization Theory and Applications 178 (2), 455-476, 2018
Cited by 43, 2018
Operator splitting performance estimation: Tight contraction factors and optimal parameter selection
EK Ryu, AB Taylor, C Bergeling, P Giselsson
SIAM Journal on Optimization 30 (3), 2251-2271, 2020
Cited by 42, 2020
Optimal complexity and certification of Bregman first-order methods
RA Dragomir, AB Taylor, A d’Aspremont, J Bolte
Mathematical Programming, 1-43, 2021
Cited by 30, 2021
Performance estimation toolbox (PESTO): automated worst-case analysis of first-order optimization methods
AB Taylor, JM Hendrickx, F Glineur
2017 IEEE 56th Annual Conference on Decision and Control (CDC), 1278-1283, 2017
Cited by 30, 2017
Convex interpolation and performance estimation of first-order methods for convex optimization
AB Taylor
Catholic University of Louvain, Louvain-la-Neuve, Belgium, 2017
Cited by 28, 2017
Efficient first-order methods for convex minimization: a constructive approach
Y Drori, AB Taylor
Mathematical Programming 184 (1), 183-220, 2020
Cited by 26, 2020
Lyapunov functions for first-order methods: Tight automated convergence guarantees
A Taylor, B Van Scoy, L Lessard
International Conference on Machine Learning (ICML) 80, 4897-4906, 2018
Cited by 25, 2018
Acceleration methods
A d'Aspremont, D Scieur, A Taylor
Foundations and Trends® in Optimization 5 (1-2), 1-245, 2021
Cited by 22, 2021
Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation
E De Klerk, F Glineur, AB Taylor
SIAM Journal on Optimization 30 (3), 2053-2082, 2020
Cited by 19, 2020
An optimal gradient method for smooth strongly convex minimization
A Taylor, Y Drori
arXiv preprint arXiv:2101.09741, 2021
Cited by 10, 2021
Complexity Guarantees for Polyak Steps with Momentum
M Barré, A Taylor, A d'Aspremont
Proceedings of the Thirty-Third Conference on Learning Theory (COLT), 2020
Cited by 9, 2020
Principled analyses and design of first-order methods with inexact proximal operators
M Barré, A Taylor, F Bach
arXiv preprint arXiv:2006.06041, 2020
Cited by 7, 2020
On the oracle complexity of smooth strongly convex minimization
Y Drori, A Taylor
Journal of Complexity 68, 101590, 2022
Cited by 5, 2022
Convergence of constrained Anderson acceleration
M Barré, A Taylor, A d'Aspremont
arXiv preprint arXiv:2010.15482, 2020
Cited by 3, 2020
Continuized Accelerations of Deterministic and Stochastic Gradient Descents, and of Gossip Algorithms
M Even, R Berthier, F Bach, N Flammarion, H Hendrikx, P Gaillard, ...
Advances in Neural Information Processing Systems 34, 2021
Cited by 1*, 2021
A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives
M Barré, A Taylor, F Bach
arXiv preprint arXiv:2106.15536, 2021
Cited by 1, 2021
Articles 1–20