Authors: Gökgöz, Çağla
Perkgöz, Cahit
Alaçam, Umut Can
Keywords: Code Generation
Deep Learning
Natural Language Processing
Abstract Syntax Tree
Issue Date: 2022
Abstract: Machine Learning has attracted researchers over the last decades and has been applied to problems in many fields. Deep Learning, a subfield of Machine Learning, has begun to be used to solve complex and hard problems as computer technology has improved. Natural language processing is one of the challenging tasks that still needs improvement for applications such as code generation. Recently, general-purpose transformer-based autoregressive language models have achieved promising results on natural language generation tasks. Generating code from natural utterances using deep learning methods could therefore reduce mental effort and time spent. In this study, a layered approach to generating Cascading Style Sheets (CSS) rules is proposed. Abstract data is first obtained from natural utterances using a large-scale language model. This information is then encoded into an Abstract Syntax Tree (AST). Finally, the AST structure is decoded to generate the CSS rules. To measure the performance of the proposed method, an experimental procedure is constructed. Using pre-trained transformers and generated training data for CSS rules, tests are applied to different datasets and accuracies are measured. Promising results are achieved for CSS code generation using structural and natural prompt designs, with overall accuracies of 46.98% and 66.07%, respectively.
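The final stage of the layered approach, decoding an AST structure into CSS rules, can be sketched as follows. This is a minimal illustration only: the node types, field names, and serialization format here are assumptions for exposition, not the authors' actual implementation.

```python
# Hypothetical sketch of the AST -> CSS decoding step: a rule is modeled as
# a selector node holding declaration nodes, then serialized to CSS text.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Declaration:
    """Leaf node: one CSS property-value pair."""
    prop: str
    value: str


@dataclass
class RuleNode:
    """Rule node: a selector plus its declaration children."""
    selector: str
    declarations: List[Declaration] = field(default_factory=list)


def decode_rule(node: RuleNode) -> str:
    """Serialize an AST rule node into a CSS rule string."""
    body = " ".join(f"{d.prop}: {d.value};" for d in node.declarations)
    return f"{node.selector} {{ {body} }}"


rule = RuleNode("h1", [Declaration("color", "red"),
                       Declaration("font-size", "24px")])
print(decode_rule(rule))  # h1 { color: red; font-size: 24px; }
```

In the paper's pipeline, the `RuleNode` contents would come from the abstract data extracted by the language model rather than being constructed by hand as above.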
ISSN: 2687-6167
Appears in Collections:Bilgisayar Mühendisliği Bölümü Koleksiyonu
TR-Dizin İndeksli Yayınlar Koleksiyonu
