Convergence of the partition function in the static word embedding model
K. Mynbaev (a), Zh. Assylbekov (b)
(a) International School of Economics,
Kazakh-British Technical University,
59 Tolebi St,
050000 Almaty, Kazakhstan
(b) Department of Mathematics,
School of Sciences and Humanities,
Nazarbayev University,
53 Kabanbay Batyr Ave,
010000 Astana, Kazakhstan
Abstract:
We develop an asymptotic theory for the partition function of the word embedding model WORD2VEC. The proof studies properties of matrices, their determinants, and the distributions of random normal vectors as their dimension tends to infinity. The conditions imposed are mild enough to cover practically important situations. The implication is that for any word $i$ from a vocabulary $\mathcal{W}$, the context vector $\mathbf{c}_i$ is a reflection of the word vector $\mathbf{w}_i$ in approximately half of the dimensions. This allows us to halve the number of trainable parameters in static word embedding models.
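The parameter-halving idea stated in the abstract can be illustrated with a minimal NumPy sketch. It assumes a fixed diagonal reflection matrix $R$ with $\pm 1$ entries that flips roughly half of the coordinates (the abstract says only that the reflection acts on approximately half of the dimensions; the choice of the first half below, and the dimensions themselves, are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 300          # embedding dimension (illustrative)
vocab = 10_000   # vocabulary size (illustrative)

# Trainable word vectors w_i, one row per vocabulary word.
W = rng.standard_normal((vocab, d))

# Diagonal of a fixed reflection R: -1 on (roughly) half of the
# coordinates, +1 on the rest. Which half is flipped is a modeling
# assumption here, chosen only for concreteness.
signs = np.ones(d)
signs[: d // 2] = -1.0

# Context vectors are derived rather than trained: c_i = R w_i,
# computed row-wise as elementwise multiplication by diag(R).
C = W * signs

# Only W (vocab * d parameters) needs training, instead of both
# W and C (2 * vocab * d parameters) as in standard WORD2VEC.
trainable_params = W.size
```

Under this tying, the flipped coordinates of each context vector are the negatives of the word vector's, and the remaining coordinates coincide, so storing $W$ alone suffices.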
Keywords and phrases:
word embeddings, partition function, neural networks, WORD2VEC, asymptotic distribution.
Received: 07.07.2022
Citation:
K. Mynbaev, Zh. Assylbekov, “Convergence of the partition function in the static word embedding model”, Eurasian Math. J., 13:4 (2022), 70–81
Linking options:
https://www.mathnet.ru/eng/emj455
https://www.mathnet.ru/eng/emj/v13/i4/p70