
Adaptive English Translation Parameter Tuning via Particle Swarm Optimization and Attention Mechanism

Author(s): Zhigao Wei (Xinxiang University, China)
Copyright: 2025
Volume: 16
Issue: 1
Pages: 17
Source title: International Journal of Swarm Intelligence Research (IJSIR)
Editor(s)-in-Chief: Yuhui Shi (Southern University of Science and Technology (SUSTech), China)
DOI: 10.4018/IJSIR.372900


Abstract

This paper proposes an adaptive parameter-tuning method for English translation models that integrates particle swarm optimization (PSO) with an attention mechanism. Key Neural Machine Translation (NMT) hyperparameters are encoded as particles, and a specialized fitness function rewards both conventional translation metrics and attention alignment quality. Furthermore, a dynamic feedback multiplier modulates PSO velocity updates in real time, incentivizing hyperparameter adjustments when attention scores deviate from predefined thresholds. Experimental evaluations on both the IWSLT En–De and WMT En–De datasets demonstrate the effectiveness of this method, yielding improvements in BLEU and ROUGE-L scores relative to standard baselines. Ablation studies further confirm that both the attention alignment term in the fitness function and the dynamic attention feedback are indispensable for achieving superior translation quality and more coherent attention patterns.
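Based only on the abstract, the tuning loop could be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the hyperparameter set and bounds (`learning_rate`, `dropout`, `label_smoothing`), the alignment threshold, the weight `LAMBDA`, and the synthetic `evaluate` function are all assumptions. A real implementation would train an NMT model for each particle and score actual BLEU and attention alignment.

```python
import random

# Illustrative sketch of PSO-based NMT hyperparameter tuning with an
# attention-alignment term in the fitness and a dynamic feedback multiplier
# on velocity updates. All names, bounds, and the toy fitness are assumptions.

BOUNDS = {"learning_rate": (1e-4, 1e-2), "dropout": (0.0, 0.5),
          "label_smoothing": (0.0, 0.2)}
KEYS = list(BOUNDS)
ALIGN_THRESHOLD = 0.6   # assumed threshold on attention alignment quality
LAMBDA = 0.5            # assumed weight of the alignment term in the fitness

def evaluate(params):
    """Synthetic stand-in for training an NMT model and measuring
    translation quality (BLEU proxy) and attention alignment quality."""
    lr, do, ls = (params[k] for k in KEYS)
    bleu_proxy = 1.0 - abs(lr - 3e-3) / 1e-2 - abs(do - 0.3) - abs(ls - 0.1)
    align_proxy = max(0.0, 1.0 - 2.0 * abs(do - 0.3))
    return bleu_proxy, align_proxy

def fitness(params):
    """Combined fitness: translation metric plus weighted alignment term."""
    bleu, align = evaluate(params)
    return bleu + LAMBDA * align, align

def pso(n_particles=10, n_iters=30, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    pos = [{k: rng.uniform(*BOUNDS[k]) for k in KEYS} for _ in range(n_particles)]
    vel = [{k: 0.0 for k in KEYS} for _ in range(n_particles)]
    pbest = [dict(p) for p in pos]
    pbest_f = [fitness(p)[0] for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = dict(pbest[g]), pbest_f[g]
    for _ in range(n_iters):
        for i, p in enumerate(pos):
            f, align = fitness(p)
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = dict(p), f
                if f > gbest_f:
                    gbest, gbest_f = dict(p), f
            # Dynamic feedback multiplier: enlarge the velocity update when
            # the alignment score falls below the threshold, pushing the
            # particle to adjust its hyperparameters more aggressively.
            feedback = 1.0 + max(0.0, ALIGN_THRESHOLD - align)
            for k in KEYS:
                r1, r2 = rng.random(), rng.random()
                vel[i][k] = feedback * (w * vel[i][k]
                                        + c1 * r1 * (pbest[i][k] - p[k])
                                        + c2 * r2 * (gbest[k] - p[k]))
                lo, hi = BOUNDS[k]
                p[k] = min(hi, max(lo, p[k] + vel[i][k]))
    return gbest, gbest_f
```

In this sketch the feedback multiplier only ever increases exploration (it is 1.0 once alignment meets the threshold), which matches the abstract's description of incentivizing adjustments when attention scores deviate from predefined thresholds.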
