Poster

FuseMoE: Mixture-of-Experts Transformers for Fleximodal Fusion

Xing Han · Huy Nguyen · Carl Harris · Nhat Ho · Suchi Saria

East Exhibit Hall A-C #1312
Thu 12 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

As machine learning models in critical fields increasingly grapple with multimodal data, they face the dual challenges of handling a wide array of modalities that are often incomplete due to missing elements, and of temporally irregular, sparsely collected samples. Successfully leveraging this complex data, while overcoming the scarcity of high-quality training samples, is key to improving these models' predictive performance. We introduce FuseMoE, a mixture-of-experts framework incorporating an innovative gating function. Designed to integrate a diverse range of modalities, FuseMoE effectively manages scenarios with missing modalities and irregularly sampled data trajectories. Theoretically, our unique gating function contributes to improved convergence rates, leading to better performance on multiple downstream tasks. The practical utility of FuseMoE is validated on a diverse set of challenging real-world prediction tasks.
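The abstract does not spell out the gating function or fusion architecture, so the following is only a minimal illustrative sketch of a generic top-k gated mixture-of-experts fusion layer of the kind the description suggests. The class name SimpleMoEFusion, the dimensions, the number of experts, and the routing scheme are all assumptions for illustration, not the paper's actual FuseMoE design.

```python
# Illustrative sketch only: a generic top-k gated MoE fusion layer.
# This is NOT the paper's FuseMoE gating function; expert count, widths,
# and routing are assumed purely for demonstration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleMoEFusion(nn.Module):
    """Fuses a sequence of token embeddings (possibly from several
    modalities projected to a shared width) with a top-k gated MoE."""

    def __init__(self, d_model: int = 128, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts)  # router over experts
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model),
                nn.GELU(),
                nn.Linear(4 * d_model, d_model),
            )
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, d_model); tokens may come from several modalities
        scores = self.gate(x)                              # (B, T, num_experts)
        topk_vals, topk_idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(topk_vals, dim=-1)             # renormalize over chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            idx = topk_idx[..., slot]                      # (B, T) expert index per token
            w = weights[..., slot].unsqueeze(-1)           # (B, T, 1) gate weight
            for e, expert in enumerate(self.experts):
                mask = (idx == e).unsqueeze(-1)            # tokens routed to expert e
                if mask.any():
                    out = out + mask * w * expert(x)
        return out


if __name__ == "__main__":
    # Toy usage: e.g. 6 text tokens + 4 time-series tokens in a shared space.
    fused_tokens = torch.randn(4, 10, 128)
    layer = SimpleMoEFusion()
    print(layer(fused_tokens).shape)  # torch.Size([4, 10, 128])
```

In this kind of setup, tokens from different (possibly missing or irregularly sampled) modalities can simply be omitted from the input sequence, and the router assigns each remaining token to a small subset of experts; how FuseMoE specifically handles missingness and irregular sampling is described in the paper itself.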
