Konstantin Mishchenko
Verified email at kaust.edu.sa - Homepage
Title · Cited by · Year
Tighter Theory for Local SGD on Identical and Heterogeneous Data
A Khaled, K Mishchenko, P Richtárik
International Conference on Artificial Intelligence and Statistics, 4519-4529, 2020
85* · 2020
Stochastic Distributed Learning with Gradient Quantization and Variance Reduction
S Horváth, D Kovalev, K Mishchenko, S Stich, P Richtárik
arXiv preprint arXiv:1904.05115, 2019
59 · 2019
Distributed Learning with Compressed Gradient Differences
K Mishchenko, E Gorbunov, M Takáč, P Richtárik
arXiv preprint arXiv:1901.09269, 2019
56 · 2019
First Analysis of Local GD on Heterogeneous Data
A Khaled, K Mishchenko, P Richtárik
NeurIPS FL Workshop, arXiv preprint arXiv:1909.04715, 2019
47 · 2019
SEGA: Variance Reduction via Gradient Sketching
F Hanzely, K Mishchenko, P Richtárik
Advances in Neural Information Processing Systems, 2082-2093, 2018
32 · 2018
A Delay-tolerant Proximal-Gradient Algorithm for Distributed Learning
K Mishchenko, F Iutzeler, J Malick, MR Amini
International Conference on Machine Learning, 3584-3592, 2018
22 · 2018
Revisiting Stochastic Extragradient
K Mishchenko, D Kovalev, E Shulgin, P Richtárik, Y Malitsky
AISTATS 2020, 2019
18 · 2019
99% of worker-master communication in distributed optimization is not needed
K Mishchenko, F Hanzely, P Richtárik
Conference on Uncertainty in Artificial Intelligence, 979-988, 2020
17* · 2020
A distributed flexible delay-tolerant proximal gradient algorithm
K Mishchenko, F Iutzeler, J Malick
SIAM Journal on Optimization 30 (1), 933-959, 2020
17 · 2020
Stochastic Newton and cubic Newton methods with simple local linear-quadratic rates
D Kovalev, K Mishchenko, P Richtárik
arXiv preprint arXiv:1912.01597, 2019
15 · 2019
Random Reshuffling: Simple Analysis with Vast Improvements
K Mishchenko, A Khaled, P Richtárik
Advances in Neural Information Processing Systems 33, 2020
12 · 2020
Adaptive Gradient Descent without Descent
K Mishchenko, Y Malitsky
37th International Conference on Machine Learning (ICML 2020), 2020
12* · 2020
MISO is Making a Comeback With Better Proofs and Rates
X Qian, A Sailanbayev, K Mishchenko, P Richtárik
arXiv preprint arXiv:1906.01474, 2019
9 · 2019
DAve-QN: A Distributed Averaged Quasi-Newton Method with Local Superlinear Convergence Rate
S Soori, K Mishchenko, A Mokhtari, MM Dehnavi, M Gurbuzbalaban
AISTATS 2020, 2019
8 · 2019
A Stochastic Decoupling Method for Minimizing the Sum of Smooth and Non-Smooth Functions
K Mishchenko, P Richtárik
arXiv preprint arXiv:1905.11535, 2019
8 · 2019
Dualize, split, randomize: Fast nonsmooth optimization algorithms
A Salim, L Condat, K Mishchenko, P Richtárik
arXiv preprint arXiv:2004.02635, 2020
6 · 2020
Sinkhorn Algorithm as a Special Case of Stochastic Mirror Descent
K Mishchenko
NeurIPS OTML Workshop, arXiv preprint arXiv:1909.06918, 2019
4 · 2019
A Stochastic Penalty Model for Convex and Nonconvex Optimization with Big Constraints
K Mishchenko, P Richtárik
arXiv preprint arXiv:1810.13387, 2018
4 · 2018
A Self-supervised Approach to Hierarchical Forecasting with Applications to Groupwise Synthetic Controls
K Mishchenko, M Montgomery, F Vaggi
Time Series Workshop at International Conference on Machine Learning, 2019
2 · 2019
Proximal and Federated Random Reshuffling
K Mishchenko, A Khaled, P Richtárik
arXiv preprint arXiv:2102.06704, 2021
1 · 2021