Poster in Workshop: Foundation Models for Science: Progress, Opportunities, and Challenges

In-Context Learning for Function Approximation with DeepSet-ONet

Shao-Ting Chiu · Junyuan Hong · Ulisses M. Braga-Neto

Keywords: [ In-Context Learning ] [ DeepSets ] [ Deep Operator Network ]


Abstract:

We propose an efficient approach to in-context learning for function approximation using a novel DeepSets Operator Network (DeepSet-ONet) architecture, which adds in-context learning capabilities to DeepONets by combining them with the DeepSets architecture. Compared with a popular transformer-based model, DeepSet-ONet reduces the number of model weights by an order of magnitude and requires a fraction of the training time. Furthermore, DeepSet-ONet is less sensitive to noise, and in fact outperforms the transformer model in high-noise settings. Experiments with in-context learning of classical linear regression demonstrate that DeepSet-ONet can be efficiently trained from in-context examples and accurately perform regression on noisy data at inference time. This study highlights the potential of DeepSet-ONet as a lightweight, high-performance framework for in-context learning.
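To make the combination concrete, the sketch below shows one plausible way a DeepSets encoder could condition a DeepONet: the in-context (x, y) pairs are embedded per-pair, mean-pooled into a permutation-invariant representation, and mapped to branch coefficients that are combined with a trunk-net basis evaluated at the query points. The abstract does not specify the exact architecture, so all module names, layer sizes, and the conditioning scheme here are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a DeepSet-ONet-style forward pass in PyTorch.
# The pooling, network sizes, and branch/trunk wiring are assumptions.
import torch
import torch.nn as nn


class DeepSetONet(nn.Module):
    def __init__(self, x_dim=1, y_dim=1, hidden=64, p=32):
        super().__init__()
        # DeepSets encoder: phi applied to each in-context (x_i, y_i) pair, then mean-pooled
        self.phi = nn.Sequential(nn.Linear(x_dim + y_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden))
        # rho maps the pooled set representation to p branch coefficients
        self.rho = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                 nn.Linear(hidden, p))
        # Trunk net: evaluates p basis functions at each query point, as in DeepONet
        self.trunk = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU(),
                                   nn.Linear(hidden, p))

    def forward(self, ctx_x, ctx_y, query_x):
        # ctx_x: (B, N, x_dim), ctx_y: (B, N, y_dim), query_x: (B, M, x_dim)
        pooled = self.phi(torch.cat([ctx_x, ctx_y], dim=-1)).mean(dim=1)  # permutation-invariant
        branch = self.rho(pooled)            # (B, p) coefficients
        basis = self.trunk(query_x)          # (B, M, p) basis values
        # DeepONet-style inner product between branch coefficients and trunk basis
        return torch.einsum('bp,bmp->bm', branch, basis)


model = DeepSetONet()
pred = model(torch.randn(8, 10, 1), torch.randn(8, 10, 1), torch.randn(8, 5, 1))
print(pred.shape)  # torch.Size([8, 5])
```

Mean pooling keeps the encoder invariant to the ordering of in-context examples and independent of their count, which is one way such a model could avoid the quadratic attention cost of a transformer-based in-context learner.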
