Deep Neural Networks with Trainable Activations and Controlled Lipschitz Constant
S. Aziznejad, H. Gupta, J. Campos, M. Unser
IEEE Transactions on Signal Processing, vol. 68, pp. 4688-4699, August 10, 2020.
We introduce a variational framework to learn the activation functions of deep neural networks. Our aim is to increase the capacity of the network while controlling an upper bound on its actual Lipschitz constant. To that end, we first establish a global bound for the Lipschitz constant of neural networks. Based on the obtained bound, we then formulate a variational problem for learning activation functions. Our variational problem is infinite-dimensional and not computationally tractable as such. However, we prove that there always exists a solution that has continuous and piecewise-linear (linear-spline) activations. This reduces the original problem to a finite-dimensional minimization in which an ℓ1 penalty on the parameters of the activations favors the learning of sparse nonlinearities. We numerically compare our scheme with standard ReLU networks and their variations, PReLU and LeakyReLU, and empirically demonstrate the practical aspects of our framework.
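As context for the abstract, a standard composition bound of the kind it alludes to is Lip(f) ≤ ∏_ℓ ‖W_ℓ‖ · ∏_ℓ Lip(σ_ℓ) for a network f = σ_L ∘ W_L ∘ ⋯ ∘ σ_1 ∘ W_1, so constraining the slopes of the activations (together with the weight norms) controls the global bound. The sketch below is a minimal, hypothetical PyTorch illustration of a learnable linear-spline activation written as an affine term plus a sum of shifted ReLUs, with an ℓ1 penalty on the slope-change coefficients; it is not the authors' implementation, and the knot grid, its range, and all names are illustrative assumptions.

import torch
import torch.nn as nn

class LinearSplineActivation(nn.Module):
    # Hypothetical sketch (not the paper's code): a learnable continuous
    # piecewise-linear activation parameterized on a fixed knot grid as
    #   sigma(x) = b0 + b1*x + sum_k a_k * relu(x - tau_k).
    # Each a_k is the slope change at knot tau_k, so an l1 penalty on the
    # a_k favors activations with few active knots (sparse nonlinearities).
    def __init__(self, num_knots=21, knot_range=3.0):  # grid size/range are assumptions
        super().__init__()
        # Fixed, uniformly spaced knots on [-knot_range, knot_range].
        self.register_buffer("knots", torch.linspace(-knot_range, knot_range, num_knots))
        self.a = nn.Parameter(torch.zeros(num_knots))    # slope changes at the knots
        self.b = nn.Parameter(torch.tensor([0.0, 1.0]))  # affine part, initialized to the identity

    def forward(self, x):
        # Broadcast x against the knot grid: (..., 1) - (num_knots,) -> (..., num_knots).
        ramps = torch.relu(x.unsqueeze(-1) - self.knots)
        return self.b[0] + self.b[1] * x + ramps @ self.a

    def l1_penalty(self):
        # Sparsity-promoting term; add lambda * l1_penalty() to the task loss.
        return self.a.abs().sum()

In such a sketch, the training objective would take the form task_loss + λ · Σ l1_penalty() over all spline modules, which mirrors the finite-dimensional ℓ1-penalized minimization described in the abstract.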
@ARTICLE(http://bigwww.epfl.ch/publications/aziznejad2001.html,
AUTHOR="Aziznejad, S. and Gupta, H. and Campos, J. and Unser, M.",
TITLE="Deep Neural Networks with Trainable Activations and Controlled
{L}ipschitz Constant",
JOURNAL="{IEEE} Transactions on Signal Processing",
YEAR="2020",
VOLUME="68",
PAGES="4688--4699",
MONTH="August 10,")
© 2020 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from IEEE.
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.