Oral in Workshop: Compositional Learning: Perspectives, Methods, and Paths Forward
Scalable and interpretable quantum natural language processing: an implementation on trapped ions
Tiffany Duneau · Saskia Bruhn · Gabriel Matos · Tuomas Laakkonen · Katerina Saiti · Anna Pearson · Konstantinos Meichanetzidis · Bob Coecke
Keywords: [ quantum natural language processing ] [ quantum computing ] [ compositional generalisation ] [ interpretability ] [ compositionality ] [ natural language processing ] [ machine learning ]
We present a compositional implementation of natural language processing tasks on a quantum computer using the QDisCoCirc model. QDisCoCirc allows for both compositional generalisation - the ability to generalise outside the training distribution by learning the compositional rules underpinning the entire data distribution - and compositional interpretability - making sense of how the model works by inspecting its modular components in isolation and the processes through which they are combined. We consider the task of question answering, for which we handcraft a toy dataset. The model components are trained on classical computers at small scales, then composed to generate larger test instances, which are evaluated on Quantinuum's H1-1 trapped-ion quantum processor. We inspect the trained models by comparing them to manually constructed, perfectly compositional models, and identify where and why our model learned compositional behaviours. As an initial baseline comparison, we consider small-scale Transformer and LSTM models, as well as GPT-4, none of which succeed at compositional generalisation on this task.
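The train-small-then-compose protocol the abstract describes can be illustrated with a classical toy sketch: components are defined once (standing in for circuits trained at small scale) and then wired together into test instances longer than any seen before. The task here (tracking actors' positions) and all names are illustrative assumptions, not the paper's actual QDisCoCirc circuits or dataset.

```python
# Hypothetical sketch of compositional evaluation: fixed per-word
# components composed into an arbitrarily long "text circuit".
# In QDisCoCirc the components would be parameterised quantum circuits;
# here they are plain functions acting on a shared state.

def move(actor, step):
    """Component for a sentence like '<actor> moves <step>':
    a function that updates the shared state (a dict of positions)."""
    def apply(state):
        new = dict(state)
        new[actor] = new.get(actor, 0) + step
        return new
    return apply

def compose(components):
    """Wire components in sequence, as a text circuit composes
    sentence circuits."""
    def apply(state):
        for component in components:
            state = component(state)
        return state
    return apply

def same_place(state, a, b):
    """Question component: are actors a and b at the same position?"""
    return state[a] == state[b]

# Components are fixed once (analogous to small-scale training), then
# composed into a larger instance and evaluated end to end.
story = compose([move("alice", 1), move("bob", 2), move("alice", 1),
                 move("bob", -1), move("alice", -1)])
final = story({"alice": 0, "bob": 0})
print(same_place(final, "alice", "bob"))  # both end at position 1
```

Because the model's behaviour on a long story is fully determined by the behaviour of its per-sentence components, each component can also be inspected in isolation, which is the sense of compositional interpretability used above.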