Exploiting local regularity properties to boost and expand safe-screening
Emmanuel Soubies
2021-10-11
Abstract

A powerful strategy to boost the performance of sparse optimization algorithms is known as safe screening: it allows the early identification of zero coordinates in the solution, which can then be eliminated to reduce the size of the problem and accelerate convergence. In this work, we extend the existing Gap Safe screening framework by relaxing the global strong-concavity assumption on the dual cost function. Instead, we exploit local regularity properties, that is, strong concavity on well-chosen subsets of the domain. The non-negativity constraint is also integrated into the existing framework. Besides making safe screening possible for a broader class of functions that includes beta-divergences (e.g., the Kullback-Leibler divergence), the proposed approach also improves upon the existing Gap Safe screening rules in previously applicable cases (e.g., logistic regression). Joint work with Cassio Dantas and Cédric Févotte.
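
For readers unfamiliar with the mechanism, the sketch below illustrates the classical Gap Safe sphere test for the Lasso, i.e., the baseline setting in which the dual objective is globally strongly concave (with constant lambda^2). It is an illustrative example only: the function name gap_safe_screen and the toy data are assumptions made for this page, and the code does not implement the relaxed, locally regular rules discussed in the talk.

# Minimal sketch (assumed example, not the talk's extended rules): classical
# Gap Safe sphere test for the Lasso, where the dual is lambda^2-strongly concave.
import numpy as np


def gap_safe_screen(X, y, beta, lam):
    """Return a boolean mask of coordinates that can be safely set to zero."""
    residual = y - X @ beta
    # Dual-feasible point obtained by rescaling the residual.
    theta = residual / max(lam, np.abs(X.T @ residual).max())
    # Duality gap between the Lasso primal and its dual at (beta, theta).
    primal = 0.5 * residual @ residual + lam * np.abs(beta).sum()
    dual = 0.5 * (y @ y) - 0.5 * lam**2 * np.sum((theta - y / lam) ** 2)
    gap = max(primal - dual, 0.0)
    # Safe-sphere radius derived from the lambda^2-strong concavity of the dual.
    radius = np.sqrt(2.0 * gap) / lam
    # Coordinate j is inactive at the optimum if |X_j^T theta| + radius * ||X_j|| < 1.
    scores = np.abs(X.T @ theta) + radius * np.linalg.norm(X, axis=0)
    return scores < 1.0


# Toy usage: screened columns can be removed before further iterations.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))
y = rng.standard_normal(50)
lam = 0.5 * np.abs(X.T @ y).max()
screened = gap_safe_screen(X, y, np.zeros(200), lam)
print(f"{screened.sum()} of {screened.size} coordinates safely eliminated")

The radius sqrt(2 * gap) / lam comes precisely from the global strong concavity of the Lasso dual; relaxing that global assumption to strong concavity on well-chosen subsets of the domain is what allows the approach above to be extended to losses such as beta-divergences.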