THE EVOLUTION OF COMPUTATIONAL LINGUISTICS AND ARTIFICIAL INTELLIGENCE PARADIGMS IN TRANSLATION STUDIES AND THEIR LINGUISTIC FOUNDATIONS
Keywords:
computational linguistics, artificial intelligence (AI) paradigms, rule-based machine translation, statistical machine translation, neural machine translation (NMT)
Abstract
This article examines the evolution of computational linguistics and artificial intelligence (AI) paradigms within translation studies, contrasting traditional human-centric theories with modern technological approaches. The discussion traces the shift from rule-based and statistical machine translation (MT) systems to the advanced capabilities of neural machine translation (NMT), highlighting how these technologies have fundamentally changed the translation process. A significant portion of the analysis focuses on the challenges of applying MT and AI to the typologically distinct languages English and Uzbek, particularly with respect to morphological complexity, semantic ambiguity, and syntactic transformation. The article concludes that while AI significantly expands the epistemological boundaries of translation studies, it serves primarily as a supportive tool rather than a replacement for human translators, whose cultural and contextual knowledge remains vital.