Unraveling the Inner Workings of Massive Language Models: Architecture, Training, and Linguistic Capacities
Author(s): C. V. Suresh Babu (Hindustan Institute of Technology and Science, India), C. S. Akkash Anniyappa (Sri Sivasubramaniya Nadar College of Engineering, India), and Dharma Sastha B. (Hindustan Institute of Technology and Science, India)
Copyright: 2024
Pages: 41
Source Title: Challenges in Large Language Model Development and AI Ethics
Source Author(s)/Editor(s): Brij Gupta (Asia University, Taichung City, Taiwan)
DOI: 10.4018/979-8-3693-3860-5.ch008
Abstract
This study explores the evolution of language models, emphasizing the shift from traditional statistical methods to advanced neural networks, particularly the transformer architecture. It aims to understand the impact of these advancements on natural language processing (NLP). The study examines the core concepts of language models, including neural networks, attention, and self-attention mechanisms, and evaluates their performance on various NLP tasks. The findings demonstrate significant improvements in language modeling, especially in dialogue generation and translation. Despite these advancements, the study highlights the need to address ethical issues such as bias, fairness, privacy, and security for responsible AI deployment.
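For readers unfamiliar with the self-attention mechanism the abstract refers to, the sketch below illustrates scaled dot-product self-attention, the core operation of the transformer architecture. It is a minimal illustrative toy (single head, random projection weights, no masking or multi-head splitting), not code from the chapter.

```python
import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention over a sequence X.

    X: (seq_len, d_model) token embeddings.
    W_q, W_k, W_v: (d_model, d_k) projection matrices
    (hypothetical names; randomly initialized below for illustration).
    """
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(K.shape[-1])         # pairwise similarities, scaled by sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over key positions
    return weights @ V                              # each token: weighted sum of all value vectors

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.normal(size=(seq_len, d_model))
W_q, W_k, W_v = (rng.normal(size=(d_model, d_k)) for _ in range(3))
print(self_attention(X, W_q, W_k, W_v).shape)  # (4, 8): one contextualized vector per token
```

Each output row mixes information from every position in the sequence, which is what lets transformers model long-range dependencies without the recurrence of earlier statistical and RNN-based language models.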