Poster · Workshop on Distribution Shifts: Connecting Methods and Applications
Instance norm improves meta-learning in class-imbalanced land cover classification
Marc Russwurm · Devis Tuia
Distribution shift is omnipresent in geographic data, where varying climatic and cultural factors lead to different representations across the globe. We aim to adapt dynamically to unseen data distributions with model-agnostic meta-learning, where the data sampled from each distribution is treated as a task with only a few annotated samples. Transductive batch normalization layers are often employed in meta-learning models, as they reach the highest numerical accuracy on the class-balanced target tasks used as meta-learning benchmarks. In this work, we demonstrate empirically that transductive batch normalization collapses when deployed on a real class-imbalanced land cover classification problem. We propose replacing transductive batch normalization with instance normalization. This modification consistently outperformed all other normalization alternatives across different meta-learning algorithms on our class-imbalanced land cover classification test tasks.
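As a rough illustration of the proposed change, the sketch below swaps the usual batch normalization layer of a standard few-shot convolutional encoder for instance normalization in PyTorch. The channel counts, the 13-band input (e.g. Sentinel-2 imagery), and the 4-block architecture are assumptions for illustration, not the authors' exact model.

```python
import torch
import torch.nn as nn

def conv_block(in_channels: int, out_channels: int) -> nn.Sequential:
    """One block of a typical few-shot feature extractor (conv -> norm -> ReLU -> pool).

    InstanceNorm2d normalizes each sample independently, so no statistics are
    shared across the (possibly class-imbalanced) query batch at meta-test time.
    """
    return nn.Sequential(
        nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
        # Replace the usual nn.BatchNorm2d(out_channels) with instance normalization.
        nn.InstanceNorm2d(out_channels, affine=True),
        nn.ReLU(inplace=True),
        nn.MaxPool2d(2),
    )

# Illustrative 4-block encoder (hypothetical configuration, not the paper's exact network).
encoder = nn.Sequential(
    conv_block(13, 64),   # e.g. 13 spectral bands as input channels (assumption)
    conv_block(64, 64),
    conv_block(64, 64),
    conv_block(64, 64),
)

x = torch.randn(5, 13, 32, 32)  # a small support batch of 5 image patches
features = encoder(x)
print(features.shape)           # torch.Size([5, 64, 2, 2])
```

Because instance normalization computes statistics per sample rather than per batch, the prediction for one query patch does not depend on which other classes happen to appear in the same batch, which is the failure mode of transductive batch normalization under class imbalance described in the abstract.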