Big Transformers for Code Generation
G. A. Arutyunov, S. M. Avdoshin, National Research University Higher School of Economics
Abstract:
The IT industry has been thriving over the past decades: numerous new programming languages, architectural patterns, and software development techniques have emerged. The tools involved in the development process ought to evolve as well. One of the key principles of the new generation of software development instruments will be their ability to learn using neural networks. First of all, such tools need to learn how to write code. In this work we study the ability of Transformers to generate competition-level code. The main goal is to discover whether open-source Big Transformers are "naturally" good coders.
Keywords:
neural networks, code generation, Transformers, GPT
Citation:
G. A. Arutyunov, S. M. Avdoshin, “Big Transformers for Code Generation”, Proceedings of ISP RAS, 34:4 (2022), 79–88
Linking options:
https://www.mathnet.ru/eng/tisp706 https://www.mathnet.ru/eng/tisp/v34/i4/p79