

Poster in Workshop: Optimization for ML Workshop

Dual Feature Reduction for the Sparse-Group Lasso and its Adaptive Variant

Fabio Feser · Marina Evangelou


Abstract:

The sparse-group lasso (SGL) performs both variable and group selection. It has found widespread use in many fields, as its sparse-group penalty exploits grouping information and shrinks inactive variables within active groups. However, SGL can be computationally expensive due to the complexity of its combined shrinkage. This paper introduces Dual Feature Reduction (DFR), a feature reduction approach for SGL and the adaptive SGL that applies strong screening rules to reduce the input space before optimization. DFR applies two layers of screening and is based on dual norms. Through synthetic and real data studies, DFR is shown to be the state-of-the-art screening rule for SGL, drastically reducing computational cost across many different scenarios and outperforming existing methods.
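For reference, the sparse-group penalty in question is the standard SGL penalty of Simon et al. (2013); for a coefficient vector $\beta$ whose entries are partitioned into groups $g = 1, \dots, G$ of sizes $p_g$, and a mixing parameter $\alpha \in [0, 1]$, it takes the form (up to the exact group-weighting convention used in the paper)

$$\lambda \left[ \alpha \lVert \beta \rVert_1 + (1-\alpha) \sum_{g=1}^{G} \sqrt{p_g}\, \lVert \beta^{(g)} \rVert_2 \right].$$

To make the two-layer screening idea concrete, below is a minimal Python sketch of a strong-rule-style screen for a penalty of this form. It is not the DFR rule itself, whose dual-norm bounds are not given in the abstract; it adapts the classic sequential strong rules (Tibshirani et al., 2012) to the grouped setting, illustrating the general pattern of discarding whole groups first and then individual variables within surviving groups. The function names, the `groups` dictionary layout, and the specific bounds are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # Elementwise soft-thresholding operator S_t(z).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def two_layer_strong_screen(X, residual, groups, alpha, lam_new, lam_prev):
    """Illustrative two-layer strong screen for an SGL-type penalty.

    NOT the DFR rule from the paper: a sketch adapting sequential
    strong rules to a grouped penalty. `groups` maps a group id to
    an array of column indices of X belonging to that group.
    """
    grad = X.T @ residual  # gradient terms x_j^T r at the previous path point
    keep_groups, keep_vars = [], []
    for g, idx in groups.items():
        # Layer 1 (group level): soft-threshold the group's gradient by the
        # lasso part of the penalty, then compare its 2-norm with a
        # strong-rule bound scaled by the group weight sqrt(p_g).
        zg = soft_threshold(grad[idx], alpha * lam_prev)
        bound = np.sqrt(len(idx)) * (1 - alpha) * (2 * lam_new - lam_prev)
        if np.linalg.norm(zg) < bound:
            continue  # whole group screened out before optimization
        keep_groups.append(g)
        # Layer 2 (variable level): within a surviving group, apply the
        # single-variable strong rule driven by the lasso component.
        for j in idx:
            if abs(grad[j]) >= alpha * (2 * lam_new - lam_prev):
                keep_vars.append(j)
    return keep_groups, keep_vars
```

In a path-wise fit, a screen like this would be called once per new regularization value, with the solver run only on `keep_vars` and the screened set verified afterwards against the optimality (KKT) conditions, since strong rules are heuristic and can occasionally discard an active feature.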
