Nagai, Yuki; Tanaka, Akinori*; Tomiya, Akio*
Physical Review D, 107(5), 054501 (16 pages), 2023/03
Times Cited Count: 3; Percentile: 84.41 (Astronomy & Astrophysics). No abstract available in English.
Nagai, Yuki; Tomiya, Akio*
no journal
We proposed a new self-learning Monte Carlo method based on the Transformer, a key technology in generative AI. Using the Transformer's attention mechanism, which can infer the relevance of distant words in a sentence, we constructed an effective model that efficiently captures the long-range correlations important to phase transitions in electronic systems. Furthermore, we reduced the number of parameters by building symmetries that the system must satisfy, such as spin rotation and spatial translation, into the network. We also found a scaling law: the loss decreases as the number of layers increases.
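As context for readers, the self-learning Monte Carlo loop described in the abstract can be sketched in a few lines: a cheap surrogate ("effective") model drives inner proposals, and an exact Metropolis-Hastings correction against the true energy removes any bias from the surrogate. Below is a minimal, hypothetical illustration in Python/NumPy with a single attention head as the effective model for a long-range Ising chain; the toy energies, random weights, and hyperparameters are assumptions for the sketch, not the authors' implementation (which, per the abstract, is trained, multi-layer, and symmetry-reduced).

    # Hypothetical sketch of self-learning Monte Carlo (SLMC) with a
    # single attention head as the effective model.  All names and
    # parameters here are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    N, BETA = 16, 0.5                        # chain length, inverse temperature

    # Target model: long-range Ising chain, E(s) = -sum_{i<j} s_i s_j / (j-i)^2.
    J = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1, N):
            J[i, j] = 1.0 / (j - i) ** 2

    def target_energy(s):
        return float(-s @ J @ s)

    # Effective model: one attention head over spin "tokens", pooled to a
    # scalar energy.  All-to-all attention scores give the model direct
    # access to long-range correlations in a single layer.  Weights are
    # random here; in SLMC they would be trained on configurations
    # generated with target_energy.
    D = 4
    Wq, Wk, Wv = (rng.normal(scale=0.3, size=(1, D)) for _ in range(3))
    w_out = rng.normal(scale=0.3, size=D)

    def eff_energy(s):
        x = s[:, None].astype(float)          # (N, 1) spin tokens
        q, k, v = x @ Wq, x @ Wk, x @ Wv      # (N, D) projections
        scores = q @ k.T / np.sqrt(D)         # (N, N) all-to-all attention
        scores = np.exp(scores - scores.max(axis=1, keepdims=True))
        att = scores / scores.sum(axis=1, keepdims=True)
        return float(((att @ v) @ w_out).sum())

    def slmc_step(s, n_inner=20):
        """One SLMC update: propose with the cheap effective model, then
        apply the exact Metropolis-Hastings correction so the target
        distribution exp(-BETA * target_energy) is sampled without bias."""
        s_new = s.copy()
        for _ in range(n_inner):              # inner Metropolis on eff_energy
            i = rng.integers(N)
            s_trial = s_new.copy(); s_trial[i] *= -1
            dE = eff_energy(s_trial) - eff_energy(s_new)
            if dE <= 0 or rng.random() < np.exp(-BETA * dE):
                s_new = s_trial
        # Standard SLMC acceptance: the effective-model bias cancels exactly.
        logA = -BETA * (target_energy(s_new) - target_energy(s)) \
               + BETA * (eff_energy(s_new) - eff_energy(s))
        return s_new if np.log(rng.random()) < logA else s

    s = rng.choice([-1, 1], size=N)
    for _ in range(200):
        s = slmc_step(s)
    print("mean |magnetization|:", abs(s.mean()))

Because the inner proposals are cheap, many effective-model updates can be bundled into one global proposal, and the final correction step guarantees the exact target distribution even when the surrogate is imperfect.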
Nagai, Yuki; Tomiya, Akio*
no journal
We proposed a new self-learning Monte Carlo method based on the Transformer, a key technology in generative AI. Using the Transformer's attention mechanism, which can infer the relevance of distant words in a sentence, we constructed an effective model that efficiently captures the long-range correlations important to phase transitions in electronic systems. Furthermore, we reduced the number of parameters by building symmetries that the system must satisfy, such as spin rotation and spatial translation, into the network. We also found a scaling law: the loss decreases as the number of layers increases. In this talk, we discuss a possible application of this technique to lattice quantum chromodynamics.
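Both abstracts also mention building symmetries such as spin rotation and spatial translation into the network to cut the parameter count. The sketch below shows one standard way to hard-wire exact invariances, group averaging over the symmetry operations, in a Z2 Ising-like toy (a global spin flip stands in for spin rotation, cyclic shifts for translation); the couplings A, fields b, and all names are hypothetical, and the paper's parameter-sharing construction is likely different in detail.

    # Hypothetical sketch: enforcing symmetries by group averaging.
    import numpy as np

    rng = np.random.default_rng(1)
    N = 8
    A = rng.normal(size=(N, N))    # assumed, deliberately asymmetric couplings
    b = rng.normal(size=N)         # assumed local fields (break flip symmetry)

    def base_energy(s):
        """A stand-in effective model with no built-in symmetry."""
        return float(s @ A @ s + b @ s)

    def symmetrized_energy(s):
        """Average base_energy over global spin flips and cyclic shifts,
        so the result is exactly invariant under both symmetries."""
        vals = [base_energy(f * np.roll(s, k))
                for f in (1, -1) for k in range(len(s))]
        return sum(vals) / len(vals)

    s = rng.choice([-1, 1], size=N)
    print(symmetrized_energy(s))
    print(symmetrized_energy(-np.roll(s, 3)))   # identical by construction

Group averaging multiplies the evaluation cost by the size of the symmetry group; weight sharing inside the layers, as the abstracts describe, achieves the same invariance without that overhead, which is where the parameter reduction comes from.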