Open Access Academic Publishing | Indexed in Google Scholar | CC BY-NC-ND 4.0
Book Chapter

Hybrid Intelligence for Natural Language Understanding and Low Resource Language Processing

Mr. M. Ratnakar Babu
Assistant Professor, Department of IT, Vidya Jyothi Institute of Technology, Hyderabad, Aziz Nagar, Telangana, India.
mratnakarbabu@gmail.com
Pages: 85-99
Keywords: Hybrid Intelligence; Natural Language Understanding; Low-Resource Languages; Transfer Learning; Symbolic AI; Machine Learning.

Abstract

Natural Language Understanding (NLU) has made significant strides in recent years, yet its application to low-resource languages remains a formidable challenge due to the scarcity of annotated data and linguistic resources. This chapter explores the potential of hybrid intelligence to address these limitations by combining the strengths of symbolic, knowledge-based approaches with data-driven machine learning models. We propose a novel hybrid methodology that integrates a symbolic knowledge base with a multilingual pre-trained language model, enhanced by a transfer learning framework. This approach is designed to improve NLU performance for low-resource languages by leveraging linguistic knowledge and transferring insights from high-resource languages. The proposed methodology is evaluated on a multilingual dataset for sentiment analysis and named entity recognition (NER) tasks, demonstrating significant improvements in performance for low-resource languages compared to traditional machine learning and symbolic methods alone. The chapter provides a comprehensive overview of the proposed hybrid model, detailed experimental results, and a discussion of the implications for the future of NLU in a multilingual context.
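To make the hybridization described above concrete, the sketch below blends a tiny hand-built sentiment lexicon (standing in for the symbolic knowledge base) with a placeholder statistical model score (standing in for the multilingual pre-trained model). The lexicon entries, the stand-in score, and the interpolation weight are illustrative assumptions, not the chapter's actual method.

```python
# Minimal sketch of a symbolic + statistical hybrid for sentiment analysis.
# The lexicon, the stand-in model score, and the interpolation weight are
# illustrative assumptions, not the chapter's actual implementation.

# Tiny symbolic knowledge base: word -> polarity in [-1, 1].
LEXICON = {"good": 0.8, "great": 1.0, "bad": -0.8, "terrible": -1.0}

def symbolic_score(tokens):
    """Average polarity of lexicon words found in the text (0 if none)."""
    hits = [LEXICON[t] for t in tokens if t in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

def model_score(tokens):
    """Stand-in for a multilingual model's sentiment score in [-1, 1]."""
    # In practice this would come from a pre-trained model fine-tuned
    # on a high-resource language and transferred to the target language.
    return 0.2  # placeholder score

def hybrid_score(text, alpha=0.5):
    """Interpolate symbolic and model scores; alpha weights the symbolic side."""
    tokens = text.lower().split()
    return alpha * symbolic_score(tokens) + (1 - alpha) * model_score(tokens)

print(round(hybrid_score("a great movie"), 2))  # 0.5*1.0 + 0.5*0.2 = 0.6
```

For a genuinely low-resource language, the lexicon side can be populated from whatever curated linguistic resources exist, so the system degrades gracefully when the statistical side has seen little target-language data.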

Principles of Hybrid Intelligent Systems