Chen-Yu Ho
Title | Cited by | Year
Scaling distributed machine learning with in-network aggregation
A Sapio, M Canini, CY Ho, J Nelson, P Kalnis, C Kim, A Krishnamurthy, ...
arXiv preprint arXiv:1903.06701, 2019
Cited by 56 | 2019
Natural compression for distributed deep learning
S Horvath, CY Ho, L Horvath, AN Sahu, M Canini, P Richtarik
arXiv preprint arXiv:1905.10988, 2019
Cited by 33 | 2019
On the discrepancy between the theoretical analysis and practical implementations of compressed communication for distributed deep learning
A Dutta, EH Bergou, AM Abdelmoniem, CY Ho, AN Sahu, M Canini, ...
Proceedings of the AAAI Conference on Artificial Intelligence 34 (04), 3817-3824, 2020
Cited by 18 | 2020
Compressed communication for distributed deep learning: Survey and quantitative evaluation
H Xu, CY Ho, AM Abdelmoniem, A Dutta, EH Bergou, K Karatsenidis, ...
Cited by 13 | 2020
On the Impact of Device and Behavioral Heterogeneity in Federated Learning
AM Abdelmoniem, CY Ho, P Papageorgiou, M Bilal, M Canini
arXiv preprint arXiv:2102.07500, 2021
2021
Efficient Sparse Collective Communication and its application to Accelerate Distributed Deep Learning
J Fei, CY Ho, AN Sahu, M Canini, A Sapio
2020
Published work in the proceedings of AAAI 2020: The Discrepancy between the Theoretical Analysis and Practical Implementations of Compressed Communication for Distributed Deep …
CY Ho, AN Sahu, M Canini, P Kalnis
2020
On the Discrepancy between the Theoretical Analysis and Practical Implementations of Compressed Communication for Distributed Deep Learning
CY Ho, AN Sahu, M Canini, P Kalnis
2020
sands-lab/layer-wise-aaai20: Code repository for AAAI'20 paper: On the Discrepancy between the Theoretical Analysis and Practical Implementations of Compressed Communication …
A Dutta, EH Bergou, AM Abdelmoniem, CY Ho, AN Sahu, M Canini, ...
Github, 2019
2019
IntML: Natural Compression for Distributed Deep Learning
S Horváth, CY Ho, L Horváth, AN Sahu, M Canini, P Richtárik