%0 Journal Article
%A Burger, Martin
%A Kabri, Samira
%A Korolev, Yury
%A Roith, Tim
%A Weigand, Lukas
%T Analysis of mean-field models arising from self-attention dynamics in transformer architectures with layer normalization
%J Philosophical Transactions of the Royal Society A
%V 383
%N 2298
%@ 1364-503X
%C London
%I Royal Soc.
%M PUBDB-2025-01273
%P 20240233
%D 2025
%X The aim of this paper is to provide a mathematical analysis of transformer architectures using a self-attention mechanism with layer normalization. In particular, observed patterns in such architectures resembling either clusters or uniform distributions pose a number of challenging mathematical questions. We focus on a special case that admits a gradient flow formulation in the spaces of probability measures on the unit sphere under a special metric, which allows us to give at least partial answers in a rigorous way. The arising mathematical problems resemble those recently studied in aggregation equations, but with additional challenges emerging from restricting the dynamics to the sphere and the particular form of the interaction energy. We provide a rigorous framework for studying the gradient flow, which also suggests a possible metric geometry to study the general case (i.e. one that is not described by a gradient flow). We further analyze the stationary points of the induced self-attention dynamics. The latter are related to stationary points of the interaction energy in the Wasserstein geometry, and we further discuss energy minimizers and maximizers in different parameter settings.
%F PUB:(DE-HGF)16
%9 Journal Article
%R 10.1098/rsta.2024.0233
%U https://bib-pubdb1.desy.de/record/626051