Poster in Affinity Event: Black in AI
Many-Shot In-Context Learning: A Solution or Just an Enhancement for Low-Resource NLP?
Henok Ademtew · Mikiyas Birbo
Large language models (LLMs) excel at few-shot in-context learning (ICL), learning from a handful of examples provided in context at inference time, without weight updates. However, their performance in low-resource languages remains poor. The rise of long-context models allows us to investigate many-shot ICL for low-resource languages. In this work, we present an analysis of many-shot ICL in Amharic, a low-resource language, across two tasks: sentiment classification and machine translation. We observe performance gains when scaling from few to many shots on both tasks, indicating that many-shot ICL enhances model performance in this low-resource setting.
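To make the setup concrete, below is a minimal sketch of how a many-shot ICL prompt for Amharic sentiment classification might be assembled. The function name, instruction wording, and example texts are illustrative assumptions, not the authors' actual pipeline; the point is that the few-shot versus many-shot comparison varies only the number of in-context demonstrations.

# Sketch: assembling a many-shot in-context-learning prompt for
# Amharic sentiment classification. All texts, labels, and names
# here are hypothetical. The resulting string would be sent to a
# long-context LLM as a single inference-time prompt; no weight
# updates are involved.

def build_many_shot_prompt(examples, query, instruction):
    """Concatenate labeled demonstrations followed by the unlabeled query."""
    shots = "\n\n".join(
        f"Text: {text}\nSentiment: {label}" for text, label in examples
    )
    return f"{instruction}\n\n{shots}\n\nText: {query}\nSentiment:"

# Hypothetical labeled pool; a many-shot regime might use hundreds of
# such pairs, limited only by the model's context window.
labeled_pool = [
    ("ይህ ፊልም በጣም ጥሩ ነው።", "positive"),   # "This movie is very good."
    ("አገልግሎቱ አስከፊ ነበር።", "negative"),    # "The service was terrible."
    # ... many more (text, label) pairs ...
]

prompt = build_many_shot_prompt(
    examples=labeled_pool,
    query="መጽሐፉ አሰልቺ ነው።",  # "The book is boring." (hypothetical query)
    instruction="Classify the sentiment of each Amharic text as positive or negative.",
)
# The prompt length grows linearly with the number of shots, which is
# exactly the quantity scaled up in a many-shot ICL study.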