

Poster in Workshop: ML with New Compute Paradigms

Multi-Task Neural Network Mapping onto Analog-Digital Heterogeneous Accelerators

Hadjer Benmeziane · Corey Lammie · Athanasios Vasilopoulos · Irem Boybat · Manuel Le Gallo · Hsinyu Tsai · Kaoutar El Maghraoui · Abu Sebastian

Sun 15 Dec, noon to 1:40 p.m. PST

Abstract:

Multi-task learning (MTL) models are increasingly popular for their ability to perform multiple tasks using shared parameters, significantly reducing redundant computation and resource utilization. These models are particularly well suited to analog-digital heterogeneous systems, where shared parameters can be mapped onto weight-stationary analog cores. This paper introduces a novel framework, Multi-task Heterogeneous Layer Mapping, designed to strategically map MTL models onto an accelerator that integrates analog in-memory computing cores and digital processing units. The framework incorporates a training process that increases task similarity and accounts for analog non-idealities through hardware-aware training. In the subsequent mapping phase, deployment on the accelerator is optimized for resource allocation and model performance by leveraging feature similarity and importance. Experiments on the COCO, UCI, and BelgiumTS datasets demonstrate that this approach reduces model parameters by up to 3× while maintaining performance within 0.03% of task-specific models.
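The abstract's mapping phase, which uses feature similarity to decide which layers are shared, can be illustrated with a small sketch. The snippet below is a minimal illustration, not the authors' implementation: per-task feature activations are compared layer by layer with cosine similarity, and layers whose features agree across all task pairs are marked as candidates for shared, weight-stationary analog cores, while divergent layers remain task-specific on digital processing units. All names here (`map_layers`, `task_features`) and the 0.9 threshold are hypothetical, chosen only for the example.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two flattened feature maps."""
    a, b = a.ravel(), b.ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def map_layers(task_features: dict, threshold: float = 0.9) -> list:
    """Assign each layer to the analog or digital tier.

    task_features maps a task name to a list of per-layer feature
    activations (hypothetical probe data, one array per layer).
    Layers whose features agree across every task pair (cosine
    similarity >= threshold) are treated as shareable and placed on
    weight-stationary analog cores; the rest stay task-specific on
    digital processing units.
    """
    tasks = list(task_features)
    n_layers = len(task_features[tasks[0]])
    placement = []
    for layer in range(n_layers):
        sims = [cosine_similarity(task_features[t1][layer],
                                  task_features[t2][layer])
                for i, t1 in enumerate(tasks)
                for t2 in tasks[i + 1:]]
        placement.append("analog-shared" if min(sims) >= threshold
                         else "digital-task-specific")
    return placement

# Toy usage: two tasks sharing features in their first two layers.
rng = np.random.default_rng(0)
shared = rng.normal(size=(3, 64))
feats = {
    "detection": [shared[0], shared[1], rng.normal(size=64)],
    "classification": [shared[0], shared[1], rng.normal(size=64)],
}
print(map_layers(feats))
# Expected: ['analog-shared', 'analog-shared', 'digital-task-specific']
```

In this toy run, the first two layers produce identical activations across tasks and are placed on analog cores, while the final layer's uncorrelated activations keep it task-specific; the paper's actual method additionally weighs feature importance and analog non-idealities, which this sketch omits.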
