IRIT, CNRS, Toulouse
Date(s) : 11/06/2021 iCal
14 h 30 min - 15 h 30 min
A powerful strategy to boost the performance of sparse optimization algorithms is known as safe screening: it allows the early identification of zero coordinates in the solution, which can then be eliminated to reduce the problem’s size and accelerate convergence. In this work, we extend the existing Gap Safe screening framework by relaxing the global strong-concavity assumption on the dual cost function. Instead, we exploit local regularity properties, that is, strong concavity on well-chosen subsets of the domain. The non-negativity constraint is also integrated into the existing framework. Besides extending safe screening to a broader class of functions that includes beta-divergences (e.g., the Kullback-Leibler divergence), the proposed approach also improves upon the existing Gap Safe screening rules in previously applicable cases (e.g., logistic regression).
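As a concrete illustration of the Gap Safe idea in its classical setting (a globally strongly concave dual, the assumption this work relaxes), here is a minimal sketch of a Gap Safe sphere test for the Lasso, 1/2‖y − Xβ‖² + λ‖β‖₁. The function names and the construction of the dual point are illustrative assumptions, not the speaker's code; the screening rule itself is the standard one: coordinate j is provably zero if |xⱼᵀθ| + r‖xⱼ‖ < 1, with radius r = √(2·gap)/λ.

```python
import numpy as np

def gap_safe_screen(X, y, beta, lam):
    """Gap Safe sphere test for the Lasso (illustrative sketch).

    Returns a boolean mask of coordinates guaranteed to be zero
    at the optimum, given any primal iterate `beta`.
    """
    residual = y - X @ beta
    # Build a dual-feasible point by rescaling the residual so that
    # |x_j^T theta| <= 1 for every column x_j of X.
    scale = min(1.0, lam / max(np.max(np.abs(X.T @ residual)), 1e-12))
    theta = (residual / lam) * scale
    # Duality gap between the primal and dual objectives.
    primal = 0.5 * residual @ residual + lam * np.sum(np.abs(beta))
    dual = 0.5 * y @ y - 0.5 * lam**2 * np.sum((theta - y / lam) ** 2)
    gap = max(primal - dual, 0.0)
    # Gap Safe sphere radius: the Lasso dual is lam^2-strongly concave,
    # giving r = sqrt(2 * gap) / lam.
    r = np.sqrt(2.0 * gap) / lam
    # Coordinate j can be safely eliminated if |x_j^T theta| + r ||x_j|| < 1.
    return np.abs(X.T @ theta) + r * np.linalg.norm(X, axis=0) < 1.0

# Usage: with lam above lam_max = ||X^T y||_inf, the solution is beta = 0
# and the rule screens out every coordinate even at the zero iterate.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 10))
y = rng.standard_normal(20)
lam = 1.1 * np.max(np.abs(X.T @ y))
mask = gap_safe_screen(X, y, np.zeros(10), lam)
```

The point of the talk's contribution is that the radius formula above relies on a global strong-concavity constant (λ² here), which does not exist for losses such as beta-divergences; the proposed rules replace it with local constants on well-chosen subsets of the dual domain.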
Joint work with Cassio Dantas and Cédric Févotte.