Net impact of large language models trained on code
SM ISO690:2012
GHERCIU, Pavel. Net impact of large language models trained on code. In: Conferinţa tehnico-ştiinţifică a studenţilor, masteranzilor şi doctoranzilor, 29-31 martie 2022, Chișinău. Chișinău, Republica Moldova: Tehnica-UTM, 2022, Vol.1, pp. 189-192. ISBN 978-9975-45-828-3.
Conferinţa tehnico-ştiinţifică a studenţilor, masteranzilor şi doctoranzilor
Vol.1, 2022
Conferința "Conferinţa tehnico-ştiinţifică a studenţilor, masteranzilor şi doctoranzilor"
Chișinău, Moldova, 29-31 March 2022

Pages 189-192

Gherciu Pavel
 
Technical University of Moldova

Available in IBN: 22 July 2022


Abstract

Natural language processing has advanced considerably in recent years, driven largely by machine learning models such as OpenAI's GPT-3. This paper presents several of these language models, with a focus on OpenAI Codex, a system widely regarded as a breakthrough for AI-assisted programming. Codex has been trained on Python code drawn from more than 50 million GitHub repositories and can generate code from natural language descriptions, explain existing code, translate it between programming languages, and more. The paper also analyses the benefits and potential dangers of its use.
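
For readers unfamiliar with how such a system is used in practice, the following is a minimal sketch of a Codex-style code-generation request through OpenAI's legacy Completions API (the openai Python library as it existed in 2022, prior to version 1.0). The model name code-davinci-002, the prompt, and the parameters are illustrative assumptions rather than details taken from the paper, and the Codex models have since been deprecated by OpenAI.

# Minimal sketch: asking a Codex model to generate code from a natural
# language description, using OpenAI's legacy Completions API (openai < 1.0).
# Model name, prompt, and parameters are illustrative assumptions.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # avoid hard-coding API keys

# Describe the desired function in natural language; the model completes it as code.
prompt = (
    '"""\n'
    "Write a Python function that returns the n-th Fibonacci number using iteration.\n"
    '"""\n'
)

response = openai.Completion.create(
    model="code-davinci-002",  # Codex model available in 2022, since deprecated
    prompt=prompt,
    max_tokens=150,            # upper bound on the length of the completion
    temperature=0,             # deterministic output is preferable for code
    stop=['"""'],              # stop before the model opens another docstring
)

print(response.choices[0].text)  # the generated Python code

In essence, the other capabilities mentioned above, such as explaining existing code or translating it between languages, are reached through the same interface by changing the prompt.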

Keywords
Artificial Intelligence, code generation, language models, OpenAI, CODEX, natural language processing