Poster
Sample-Conditioned Hypothesis Stability Sharpens Information-Theoretic Generalization Bounds
Ziqiao Wang · Yongyi Mao
Great Hall & Hall B1+B2 (level 1) #1800
Abstract:
We present new information-theoretic generalization guarantees through a novel construction of the "neighboring-hypothesis" matrix and a new family of stability notions termed sample-conditioned hypothesis (SCH) stability. Our approach yields sharper bounds that improve upon previous information-theoretic bounds in various learning scenarios. Notably, these bounds address the limitations of existing information-theoretic bounds in the context of stochastic convex optimization (SCO) problems, as explored in the recent work by Haghifam et al. (2023).
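For context (this formula is not part of the abstract), the baseline that bounds of this type aim to sharpen is the classical mutual-information generalization bound of Xu & Raginsky (2017); a sketch in standard notation, assuming a sigma-sub-Gaussian loss, is:

% Classical information-theoretic generalization bound (Xu & Raginsky, 2017),
% assuming the loss is \sigma-sub-Gaussian under the data distribution:
\[
  \left| \mathbb{E}\big[ L_\mu(W) - L_S(W) \big] \right|
  \;\le\; \sqrt{\frac{2\sigma^2 \, I(W; S)}{n}},
\]
% where W is the learned hypothesis, S the training set of n samples,
% L_\mu and L_S the population and empirical risks, and I(W;S) the
% mutual information between the hypothesis and the training data.

Bounds of this form can be vacuous when I(W;S) is large (e.g., in SCO), which is the limitation the abstract refers to.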