Poster
Multi-Instance Partial-Label Learning with Margin Adjustment
Wei Tang · Yin-Fang Yang · Zhaofei Wang · Weijia Zhang · Min-Ling Zhang
Multi-instance partial-label learning (MIPL) is an emerging learning framework where each training sample is represented as a multi-instance bag associated with a candidate label set. Existing MIPL algorithms often overlook the margins for attention scores and predicted probabilities, leading to suboptimal generalization performance. A critical issue with these algorithms is that the highest prediction probability of the classifier may appear on a non-candidate label. In this paper, we propose an algorithm named MIPLMA, i.e., Multi-Instance Partial-Label learning with dual Margin Adjustment, which adjusts the margins for attention scores and predicted probabilities. We introduce a margin-aware attention mechanism to dynamically adjust the margins for attention scores and propose a margin-compliant loss to constrain the margins between the predicted probabilities on candidate and non-candidate label sets. Experimental results on benchmark and real-world datasets demonstrate the superior performance of MIPLMA over existing MIPL algorithms, as well as other well-established multi-instance learning algorithms and partial-label learning algorithms. The source code of MIPLMA is included in the supplementary material and will be publicly accessible.
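The margin-compliant loss described above can be illustrated with a minimal sketch. This is an assumption-laden hinge-style formulation, not the exact loss from the paper: it penalizes the classifier whenever the highest predicted probability on the candidate label set fails to exceed the highest probability on the non-candidate set by at least a fixed margin.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D logit vector.
    e = np.exp(z - z.max())
    return e / e.sum()

def margin_compliant_loss(logits, candidate_mask, margin=0.1):
    """Hinge-style sketch of a margin constraint between candidate and
    non-candidate labels. The exact loss used by MIPLMA may differ;
    `margin` is a hypothetical hyperparameter for illustration."""
    probs = softmax(logits)
    p_cand = probs[candidate_mask].max()      # best candidate probability
    p_non = probs[~candidate_mask].max()      # best non-candidate probability
    # Zero loss once the candidate set leads by at least `margin`.
    return max(0.0, margin - (p_cand - p_non))

# A bag whose classifier currently favors a non-candidate label (index 2)
# incurs a positive loss, which pushes probability mass back toward the
# candidate set during training.
logits = np.array([2.0, 1.0, 3.0])
mask = np.array([True, True, False])
print(margin_compliant_loss(logits, mask))  # positive: margin is violated
```

This directly targets the failure mode noted above, where the highest prediction probability lands on a non-candidate label.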