Poster
Clustering in Causal Attention Masking
Nikita Karagodin · Yury Polyanskiy · Philippe Rigollet
This work presents a modification of the self-attention dynamics proposed by Geshkovski et al. to better reflect the practically relevant causally masked attention used in transformer architectures for generative AI. This modification translates into an interacting particle system that cannot be interpreted as a mean-field gradient flow. Despite this loss of structure, we significantly strengthen the results of Geshkovski et al. in this context: while previous rigorous results covered only the case where all three matrices (key, query, and value) are scaled identities, we prove asymptotic convergence to a single cluster for arbitrary key-query matrices and a value matrix equal to the identity. Additionally, we establish a connection to the classical Rényi parking problem from combinatorial geometry, taking initial theoretical steps toward demonstrating the existence of meta-stable states.
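To illustrate the dynamics the abstract describes, here is a minimal numerical sketch. It assumes the Geshkovski et al. setup of particles evolving on the unit sphere under continuous-time self-attention, with a causal mask (particle i only attends to particles j ≤ i), key-query matrix a scalar multiple of the identity (a special case of the arbitrary key-query matrices covered by the result), and value matrix equal to the identity. The function name, step size, and temperature parameter `beta` are illustrative choices, not from the paper.

```python
import numpy as np

def causal_attention_dynamics(X, beta=1.0, dt=0.1, steps=2000):
    """Euler integration of causally masked self-attention dynamics
    on the unit sphere: particle i attends only to particles j <= i.
    Key-query matrix is beta * I and value matrix is I (a special
    case used here for illustration)."""
    n, d = X.shape
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    for _ in range(steps):
        # attention logits <Q x_i, K x_j> = beta * <x_i, x_j>
        logits = beta * (X @ X.T)
        mask = np.tril(np.ones((n, n), dtype=bool))  # causal: j <= i
        logits = np.where(mask, logits, -np.inf)
        w = np.exp(logits - logits.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)            # row-wise softmax
        drift = w @ X                                # value matrix = identity
        # project the drift onto the tangent space of the sphere at each x_i
        drift -= np.sum(drift * X, axis=1, keepdims=True) * X
        X = X + dt * drift
        X = X / np.linalg.norm(X, axis=1, keepdims=True)
    return X

rng = np.random.default_rng(0)
X0 = rng.standard_normal((8, 3))
Xf = causal_attention_dynamics(X0)
# pairwise inner products near 1 indicate collapse to a single cluster
print(np.min(Xf @ Xf.T))
```

Note that under the causal mask the first particle attends only to itself, so its tangential drift vanishes and it stays fixed; the single cluster forms at that leading particle's position, which is one way the masked dynamics differ from the unmasked mean-field gradient flow.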