Open Access Academic Publishing | Indexed in Google Scholar | CC BY-NC-ND 4.0
Book Chapter

Zero-Shot and Few-Shot Learning Approaches Using Large Language Models for Low-Resource Languages

Mrs. Geetha R
Assistant Professor, Department of Computer Science and Engineering (AI & ML), Nagarjuna College of Engineering and Technology, Bengaluru, Karnataka, India.
geetha.r@ncetmail.com
Pages: 58-72
Keywords: Low-Resource Languages; Zero-Shot Learning; Few-Shot Learning; Prompt Engineering; Large Language Models.

Abstract

The proliferation of Large Language Models (LLMs) has revolutionized the field of Natural Language Processing (NLP), yet their benefits remain largely concentrated in high-resource languages like English. This chapter addresses the critical challenge of applying LLMs to low-resource languages, which lack the extensive digital data required for traditional model training. We explore the efficacy of zero-shot and few-shot learning as powerful, data-efficient paradigms for unlocking the capabilities of LLMs in these underserved linguistic contexts. This chapter provides a comprehensive overview of the theoretical underpinnings of zero-shot and few-shot learning, followed by a detailed review of the current state of the art. We propose a structured methodology centered on advanced prompt engineering techniques to maximize performance on a variety of NLP tasks, including translation, sentiment analysis, and named entity recognition. Through a series of experiments on several low-resource African languages (Swahili, Yoruba, Hausa, Zulu, and Amharic) using benchmark datasets like FLORES-200, we demonstrate that few-shot learning significantly outperforms zero-shot approaches and, in some cases, can approach the performance of fully supervised models without the need for extensive labeled data. The results highlight the critical role of in-context learning and prompt design in bridging the performance gap. This chapter concludes with a discussion of the practical implications, current limitations, and future directions for creating more equitable and inclusive language technologies.
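To make the few-shot paradigm described above concrete, the following is a minimal sketch of how an in-context learning prompt might be assembled for Swahili sentiment analysis. The exemplar sentences, labels, and the `build_few_shot_prompt` helper are illustrative assumptions for this sketch, not artifacts from the chapter's experiments; the resulting string would be passed to an LLM of the reader's choice.

```python
def build_few_shot_prompt(task_description: str,
                          examples: list[tuple[str, str]],
                          query: str) -> str:
    """Assemble an in-context learning prompt from labeled exemplars.

    Each exemplar is rendered as a Text/Sentiment pair; the query is
    appended with an open-ended "Sentiment:" cue for the model to complete.
    """
    lines = [task_description, ""]
    for text, label in examples:
        lines.append(f"Text: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    lines.append(f"Text: {query}")
    lines.append("Sentiment:")
    return "\n".join(lines)


# Illustrative Swahili exemplars (translations in comments).
examples = [
    ("Huduma ilikuwa nzuri sana.", "positive"),  # "The service was very good."
    ("Chakula kilikuwa kibaya.", "negative"),    # "The food was bad."
]

prompt = build_few_shot_prompt(
    "Classify the sentiment of the Swahili text as positive or negative.",
    examples,
    "Safari ilikuwa ya kufurahisha.",            # "The trip was enjoyable."
)
print(prompt)
```

A zero-shot variant of the same idea simply omits the `examples` list, leaving only the task description and the query; the chapter's reported gap between the two settings comes down to how much the exemplars constrain the model's output format and label space.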

Next-Generation Artificial Intelligence: From Foundations to Intelligent Applications