

Poster in Workshop: Optimization for ML Workshop

Differentially Private Random Block Coordinate Descent

Artavazd Maranjyan · Abdurakhmon Sadiev · Peter Richtarik


Abstract:

Due to their effectiveness on high-dimensional problems and their ability to decompose complex optimization problems, Coordinate Descent (CD) methods have garnered significant interest in machine learning over the last decade. However, classical CD methods were neither designed nor analyzed with data privacy considerations in mind, even though such considerations are increasingly critical when handling sensitive information. This gap recently led to the development of differentially private CD methods, such as DP-CD proposed by Mangold et al. (ICML 2022). Despite this progress, a disparity remains between non-private CD and DP-CD methods. Our work proposes a differentially private random block coordinate descent method that selects multiple coordinates with varying probabilities at each iteration using sketch matrices.
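To make the idea concrete, the sketch below illustrates one way a differentially private random block coordinate descent step could look: each coordinate is included in the block independently with its own probability, and the selected partial gradients are clipped and perturbed with Gaussian noise before the update. All names, the clipping rule, and the noise calibration here are illustrative assumptions for a generic DP block-CD scheme, not the exact algorithm or privacy accounting of the paper.

```python
import numpy as np


def dp_random_block_cd(grad_fn, x0, probs, step_sizes, clip, sigma, n_iters, rng=None):
    """Hypothetical sketch of DP random block coordinate descent.

    At each iteration, coordinate j joins the block independently with
    probability probs[j]; the corresponding partial gradients are clipped
    to bound sensitivity and perturbed with Gaussian noise (scale sigma * clip)
    before a coordinate-wise gradient step. Illustrative only.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0.copy()
    for _ in range(n_iters):
        # Random block selection: independent Bernoulli draw per coordinate.
        block = rng.random(x.shape[0]) < probs
        if not block.any():
            continue
        g = grad_fn(x)                     # gradient of the smooth objective at x
        g_block = np.clip(g[block], -clip, clip)          # bound per-coordinate sensitivity
        g_block += sigma * clip * rng.standard_normal(g_block.shape)  # Gaussian noise
        x[block] -= step_sizes[block] * g_block           # coordinate-wise update
    return x


# Toy usage on a least-squares objective f(x) = 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
x_out = dp_random_block_cd(
    grad_fn=lambda x: A.T @ (A @ x - b),
    x0=np.zeros(10),
    probs=np.full(10, 0.3),          # each coordinate sampled with probability 0.3
    step_sizes=np.full(10, 0.01),    # e.g. inverse coordinate-wise smoothness constants
    clip=1.0,
    sigma=1.0,
    n_iters=200,
)
```

Coordinate-wise step sizes and per-coordinate sampling probabilities are the knobs that distinguish such block schemes from full-gradient DP-SGD; how to set them and how the noise composes into a formal privacy guarantee is what the paper's analysis addresses.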
