TY  - THES
AU  - Roith, Tim
TI  - Consistency, Robustness and Sparsity for Learning Algorithms
PB  - Friedrich-Alexander-Universität Erlangen-Nürnberg
M3  - Dissertation
M1  - PUBDB-2025-01281
SP  - 217
PY  - 2024
N1  - Dissertation, Friedrich-Alexander-Universität Erlangen-Nürnberg, 2024
AB  - This thesis is concerned with consistency, robustness, and sparsity of supervised and semi-supervised learning algorithms. For the latter, we consider the so-called Lipschitz learning task (Nadler, Boaz, Nathan Srebro, and Xueyuan Zhou. 'Statistical analysis of semi-supervised learning: The limit of infinite unlabelled data.' Advances in Neural Information Processing Systems 22 (2009)), for which we prove Gamma-convergence and convergence rates of discrete solutions to their continuum counterpart in the infinite data limit. In the supervised regime, we deal with input robustness with respect to adversarial attacks and resolution changes. For the multi-resolution setting, we analyze the role of Fourier neural operators (Li, Zongyi, et al. 'Fourier neural operator for parametric partial differential equations.' arXiv preprint arXiv:2010.08895 (2020)) and their connection to standard convolutional neural layers. Concerning the computational complexity of neural network training, we propose an algorithm based on Bregman iterations (Osher, Stanley, et al. 'An iterative regularization method for total variation-based image restoration.' Multiscale Modeling & Simulation 4.2 (2005)) that allows for sparse weight matrices throughout training. We also provide a convergence analysis for the stochastic adaptation of the original Bregman iterations.
KW  - Machine Learning (Other)
KW  - Consistency (Other)
KW  - Sparsity (Other)
KW  - Robustness (Other)
KW  - DDC Classification::5 Naturwissenschaften::50 Naturwissenschaften::500 Naturwissenschaften und Mathematik (Other)
LB  - PUB:(DE-HGF)11
DO  - 10.25593/OPEN-FAU-522
UR  - https://bib-pubdb1.desy.de/record/626059
ER  -