| Title | Authors | Venue | Cited by | Year |
|---|---|---|---|---|
| Deep neural network approximation theory | D Elbrächter, D Perekrestenko, P Grohs, H Bölcskei | arXiv preprint arXiv:1901.02220 | 180 | 2019 |
| DNN expression rate analysis of high-dimensional PDEs: application to option pricing | D Elbrächter, P Grohs, A Jentzen, C Schwab | Constructive Approximation 55 (1), 3–71 | 102 | 2022 |
| Group testing for SARS-CoV-2 allows for up to 10-fold efficiency increase across realistic scenarios and testing strategies | CM Verdun, T Fuchs, P Harar, D Elbrächter, DS Fischer, J Berner, ... | Frontiers in Public Health 9, 583377 | 49 | 2021 |
| The universal approximation power of finite-width deep ReLU networks | D Perekrestenko, P Grohs, D Elbrächter, H Bölcskei | arXiv preprint arXiv:1806.01528 | 38 | 2018 |
| How degenerate is the parametrization of neural networks with the ReLU activation function? | J Berner, D Elbrächter, P Grohs | Advances in Neural Information Processing Systems, 7790–7801 | 30* | 2019 |
| Towards a regularity theory for ReLU networks – chain rule and global error estimates | J Berner, D Elbrächter, P Grohs, A Jentzen | 2019 13th International Conference on Sampling Theory and Applications … | 13 | 2019 |
| Redistributor: Transforming Empirical Data Distributions | P Harar, D Elbrächter, M Dörfler, KD Johnson | arXiv preprint arXiv:2210.14219 | | 2022 |
| The Oracle of DLphi | D Alfke, W Baines, J Blechschmidt, MJ Sarmina, A Drory, D Elbrächter, ... | arXiv preprint arXiv:1901.05744 | | 2019 |