Open Access Academic Publishing | Indexed in Google Scholar | CC BY-NC-ND 4.0
Book Chapter

Transformer-Based Frameworks for Automated Code Generation and Software Optimization

D. Mahitha
Assistant Professor, School of Computer Science and Engineering, Malla Reddy Engineering College for Women, Maisammaguda, Secunderabad, Telangana, India.
mahithadilli@gmail.com
Pages: 136-153
Keywords: Transformer-Based Code Generation; Software Optimization; Code Intelligence; Automated Programming; Structure-Aware Encoder

Abstract

The accelerating demand for efficient and scalable software development has catalyzed the exploration of AI-driven solutions for automating complex programming tasks. This chapter presents a comprehensive study on the application of transformer-based frameworks for automated code generation and software optimization. We examine the ability of these models to translate high-level natural language descriptions and formal specifications into executable, high-quality code. The chapter introduces a novel transformer-based methodology that integrates a structure-aware encoder with a dedicated optimization module to enhance both code generation accuracy and runtime performance. We evaluate our proposed model against several leading benchmarks, including HumanEval, MBPP, and CodeXGLUE, demonstrating significant improvements over existing state-of-the-art models like CodeBERT, GraphCodeBERT, and AlphaCode. Our findings reveal that the proposed framework excels in capturing programming intent, generating context-aware code, and performing automated refactoring to optimize for execution speed and memory efficiency. The results and discussion section provides an in-depth analysis of performance metrics, error distribution, and the trade-offs between model size and accuracy. By synthesizing current advancements and addressing existing limitations, this work contributes to the evolving field of code intelligence and highlights future directions for developing more robust, generalizable, and trustworthy AI systems for software development.
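The two components named in the abstract — a structure-aware encoder input and an automated refactoring pass — can be illustrated with a toy sketch. The code below is not the chapter's actual model: `structural_tokens` is a hypothetical stand-in for the structure channel (a linearized AST node-type sequence) such an encoder might consume alongside plain tokens, and `LenComparisonRewriter` is a deliberately simple rule-based micro-refactoring (`len(x) == 0` → `not x`) standing in for the learned optimization module.

```python
import ast


def structural_tokens(source: str) -> list[str]:
    """Linearize a snippet's AST node types -- a hypothetical stand-in for
    the structural input channel of a structure-aware encoder."""
    tree = ast.parse(source)
    return [type(node).__name__ for node in ast.walk(tree)]


class LenComparisonRewriter(ast.NodeTransformer):
    """Toy 'optimization module': rewrite `len(x) == 0` into `not x`.
    A single hand-written rule, used here purely for illustration."""

    def visit_Compare(self, node: ast.Compare) -> ast.expr:
        self.generic_visit(node)
        if (isinstance(node.left, ast.Call)
                and isinstance(node.left.func, ast.Name)
                and node.left.func.id == "len"
                and len(node.left.args) == 1
                and len(node.ops) == 1
                and isinstance(node.ops[0], ast.Eq)
                and isinstance(node.comparators[0], ast.Constant)
                and node.comparators[0].value == 0):
            # Replace the whole comparison with `not <argument of len>`.
            return ast.UnaryOp(op=ast.Not(), operand=node.left.args[0])
        return node


def optimize(source: str) -> str:
    """Parse, apply the rewriting pass, and unparse back to source text."""
    tree = LenComparisonRewriter().visit(ast.parse(source))
    return ast.unparse(ast.fix_missing_locations(tree))
```

For example, `optimize("if len(items) == 0:\n    x = 1")` yields `if not items:\n    x = 1`. A real transformer-based pipeline would learn such rewrites from data rather than encode them by hand; the sketch only makes the encoder-input and refactoring-pass roles concrete.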

References

  1. Hadi Ghaemi et al. “Transformers in source code generation: A comprehensive survey”. In: Journal of Systems Architecture 153 (2024), p. 103193.
  2. Ashish Vaswani et al. “Attention is all you need”. In: Advances in Neural Information Processing Systems 30 (2017), pp. 5998–6008.
  3. Mark Chen et al. “Evaluating large language models trained on code”. In: arXiv preprint arXiv:2107.03374 (2021).
  4. Shuai Lu et al. “CodeXGLUE: A machine learning benchmark dataset for code understanding and generation”. In: arXiv preprint arXiv:2102.04664 (2021).
  5. Zhangyin Feng et al. “CodeBERT: A pre-trained model for programming and natural languages”. In: arXiv preprint arXiv:2002.08155 (2020).
  6. Daya Guo et al. “GraphCodeBERT: Pre-training code representations with data flow”. In: arXiv preprint arXiv:2009.08366 (2020).
  7. Yue Wang et al. “CodeT5: Identifier-aware unified pre-trained encoder-decoder models for code understanding and generation”. In: arXiv preprint arXiv:2109.00859 (2021).
  8. Yujia Li et al. “Competition-level code generation with AlphaCode”. In: Science 378.6624 (2022), pp. 1092–1097.
  9. Sotiris Kotsiantis, Vassilios Verykios, and Manolis Tzagarakis. “AI-assisted programming tasks using code embeddings and transformers”. In: Electronics 13.4 (2024), p. 767.
Next-Generation Artificial Intelligence: From Foundations to Intelligent Applications