Research Article

Demystification of Artificial Intelligence Systems in Linguistic Intelligence and English Language Domains

Authors

  • Maitham Sarhan H. Alhamad, Wasit Directorate of Education, Iraq

Abstract

The term "artificial intelligence" was coined by John McCarthy, who defined it as "the science and engineering of making intelligent machines" in a proposal for a 1956 conference held at Dartmouth College. That conference launched serious AI research in the decades that followed. The concept of artificial intelligence, however, is not as modern as we might think: it traces back to 1950, when Alan Turing proposed the Turing Test, and the first chatbot computer program, ELIZA, was created in the 1960s. In 2017, 61% of Europeans were positive about robotics and AI, while 30% were negative and felt these technologies needed to be managed carefully. The dynamics of public opposition and acceptance could be important factors shaping AI's long-term development path. The theoretical framework is that artificial intelligence can be viewed as an "overarching rubric which encompasses machine learning, which further encompasses deep learning." Rich and Knight (1991, p. 3) stated that "artificial intelligence (AI) is the study of how to make computers do things which, at the moment, people do better."

Article information

Journal

International Journal of Linguistics, Literature and Translation

Volume (Issue)

8 (5)

Pages

622-633

Published

2025-05-24

How to Cite

Maitham Sarhan H. Alhamad. (2025). Demystification of Artificial Intelligence Systems in Linguistic Intelligence and English Language Domains. International Journal of Linguistics, Literature and Translation, 8(5), 622-633. https://doi.org/10.32996/ijllt.2025.8.5.16

Views

63

Downloads

45

Keywords:

Artificial intelligence, linguistic intelligence, computer programmes, computational linguistics, complex biological system, mindfulness, ambiguities in AI