Lipschitz Function Approximation using DeepSpline Neural Networks
S. Neumayer
EPFL-CIS & RIKEN-AIP Joint Workshop on Machine Learning (JWML'22), Virtual, September 7-8, 2022.
In this talk, we investigate NNs with prescribed bounds on their Lipschitz constant. One way to obtain Lipschitz-constrained NNs is to impose constraints on the architecture, e.g., norm constraints on the weights combined with 1-Lipschitz activation functions. It turns out that this significantly limits the expressivity if we use the popular ReLU activation function: such networks cannot represent even simple continuous piecewise-linear functions. In contrast, using learnable linear splines instead fixes this problem and leads to maximal expressivity among all component-wise activation functions. From the many possible applications of Lipschitz-constrained NNs, we discuss one in more detail to show that the theoretical observations also translate into improved performance.
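To make the construction concrete, below is a minimal PyTorch sketch of a learnable linear-spline activation whose slopes are clipped to [-1, 1], so that the activation is 1-Lipschitz by construction. This is an illustrative toy, not the authors' DeepSplines package; the class name, knot parameterization, and all defaults are assumptions made for this example.

```python
import torch
import torch.nn as nn


class LinearSpline1Lip(nn.Module):
    """Learnable linear-spline activation on a uniform knot grid.

    Illustrative sketch (not the authors' DeepSplines package): the spline
    is parameterized by its values at equally spaced knots, and each slope
    is clipped to [-1, 1], which makes the activation 1-Lipschitz.
    """

    def __init__(self, num_knots: int = 21, grid_range: float = 2.0):
        super().__init__()
        assert num_knots >= 2
        self.register_buffer(
            "knots", torch.linspace(-grid_range, grid_range, num_knots)
        )
        self.step = 2 * grid_range / (num_knots - 1)
        # Initialize as the identity map (slope 1 everywhere), itself 1-Lipschitz.
        self.values = nn.Parameter(self.knots.clone())

    def constrained_values(self) -> torch.Tensor:
        # Clip each segment slope to [-1, 1] and rebuild the knot values
        # by a cumulative sum, enforcing the Lipschitz bound exactly.
        slopes = (torch.diff(self.values) / self.step).clamp(-1.0, 1.0)
        v0 = self.values[:1]
        return torch.cat([v0, v0 + torch.cumsum(slopes * self.step, dim=0)])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        v = self.constrained_values()
        # Left-knot index for each input; inputs outside the grid are
        # extrapolated linearly with the boundary slopes.
        idx = torch.clamp(
            torch.floor((x - self.knots[0]) / self.step).long(),
            0, len(self.knots) - 2,
        )
        slope = (v[idx + 1] - v[idx]) / self.step
        return v[idx] + slope * (x - self.knots[idx])
```

Pairing such activations with linear layers of operator norm at most one, e.g. via torch.nn.utils.parametrizations.spectral_norm, yields a network whose end-to-end Lipschitz constant is bounded by one, since Lipschitz constants multiply under composition.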
@INPROCEEDINGS{neumayer2203,
    AUTHOR = "Neumayer, S.",
    TITLE = "{L}ipschitz Function Approximation using {DeepSpline} Neural Networks",
    BOOKTITLE = "{EPFL-CIS} \& {RIKEN-AIP} Joint Workshop on Machine Learning ({JWML'22})",
    YEAR = "2022",
    ADDRESS = "Virtual",
    MONTH = "September 7-8",
    URL = "http://bigwww.epfl.ch/publications/neumayer2203.html"
}