Poster in Workshop on Distribution Shifts: Connecting Methods and Applications
Relational Out-of-Distribution Generalization
Xinyu Yang · Xinyi Pan · Shengchao Liu · Huaxiu Yao
In out-of-distribution (OOD) generalization, domain relations are an important factor: they provide a global view of how domains relate to one another, e.g., protein domains in a binding-affinity task or geographical-location domains in a weather-forecasting task. Existing work largely leaves this relational information unused; in this work, we explore how to incorporate it into solving the distribution-shift problem. We propose READ, a general multi-head deep learning framework that harnesses domain relations to generalize to unseen domains in a structured learning and inference manner. In READ, every training domain shares a common backbone but learns its own separate head. Through a proposed explicit regularization, READ simulates the generalization process among heads: a weighted ensemble prediction from the heads of domains other than the input domain is computed via the domain relation and aligned with the target. To improve the reliability of the domain relation, READ further leverages similarity metric learning to refine the initial relation. Empirically, we evaluate READ on three domain generalization benchmarks. The results show that READ consistently improves upon existing state-of-the-art methods on datasets from diverse fields.
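The abstract does not include an implementation, so the following is only a minimal PyTorch-style sketch of the multi-head design it describes: a shared backbone with one head per training domain, and a regularization term that aligns a relation-weighted ensemble of the other domains' heads with the target. All names (`RelationalMultiHead`, `read_style_loss`), the MSE losses, and the learnable relation matrix standing in for the similarity-metric-learning update are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (assumed, not the official READ implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class RelationalMultiHead(nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim, num_domains, relation):
        super().__init__()
        # Shared backbone across all training domains.
        self.backbone = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        # One prediction head per training domain.
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_dim, out_dim) for _ in range(num_domains)]
        )
        # Initial domain-relation matrix (num_domains x num_domains), e.g. from
        # protein similarity or geographic distance; made learnable here as a
        # crude stand-in for the paper's similarity-metric-learning refinement.
        self.relation = nn.Parameter(relation.clone())

    def forward(self, x):
        z = self.backbone(x)
        # Stack per-head predictions: (num_domains, batch, out_dim).
        return torch.stack([head(z) for head in self.heads], dim=0)


def read_style_loss(preds, y, domain_idx, relation, reg_weight=1.0):
    """Task loss on the input domain's own head, plus a regularizer that
    simulates generalization: an ensemble of the *other* domains' heads,
    weighted by the domain relation, must also match the target."""
    own_loss = F.mse_loss(preds[domain_idx], y)

    # Relation weights over every domain except the input domain.
    others = [i for i in range(relation.size(0)) if i != domain_idx]
    w = torch.softmax(relation[domain_idx, others], dim=0)   # (num_domains-1,)

    ensemble = (w[:, None, None] * preds[others]).sum(dim=0)  # (batch, out_dim)
    reg_loss = F.mse_loss(ensemble, y)
    return own_loss + reg_weight * reg_loss


# Toy usage with synthetic data.
num_domains, in_dim, out_dim = 4, 8, 1
init_relation = torch.eye(num_domains) + 0.1
model = RelationalMultiHead(in_dim, 16, out_dim, num_domains, init_relation)

x = torch.randn(32, in_dim)
y = torch.randn(32, out_dim)
preds = model(x)  # (num_domains, 32, 1)
loss = read_style_loss(preds, y, domain_idx=0, relation=model.relation)
loss.backward()
```

At test time on an unseen domain, the same relation weights could be used to ensemble all trained heads; how READ performs that structured inference is specified in the paper, not in this sketch.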