Adaptive English Translation Parameter Tuning via Particle Swarm Optimization and Attention Mechanism
Abstract
This paper proposes an adaptive parameter tuning method for English translation models that integrates particle swarm optimization (PSO) with an attention mechanism. Key neural machine translation (NMT) hyperparameters are encoded as particles, and a specialized fitness function rewards both conventional translation metrics and attention alignment quality. Furthermore, a dynamic feedback multiplier modulates PSO velocity updates in real time, encouraging hyperparameter adjustments whenever attention scores deviate from predefined thresholds. Experimental evaluations on the IWSLT En–De and WMT En–De datasets demonstrate the effectiveness of the method, yielding improvements in BLEU and ROUGE-L scores over standard baselines. Ablation studies further confirm that both the attention alignment term in the fitness function and the dynamic attention feedback are indispensable for achieving superior translation quality and more coherent attention patterns.
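The abstract gives no implementation, but the update it describes can be sketched: a standard PSO velocity step scaled by an attention-driven feedback multiplier, optimizing a fitness that combines a translation metric with attention alignment. The sketch below is a minimal illustration under assumed names and constants (BOUNDS, ALPHA, ATTN_THRESHOLD, FEEDBACK_GAIN, and the evaluate callback are all hypothetical, not the paper's actual settings).

```python
import random

# Hypothetical NMT hyperparameters encoded as a particle's position.
# The specific parameters and bounds are illustrative assumptions.
BOUNDS = {"lr": (1e-5, 1e-3), "dropout": (0.0, 0.5), "label_smoothing": (0.0, 0.2)}

W, C1, C2 = 0.7, 1.5, 1.5   # standard PSO inertia and acceleration coefficients
ALPHA = 0.5                 # assumed weight of the attention-alignment term
ATTN_THRESHOLD = 0.6        # assumed alignment threshold that triggers feedback
FEEDBACK_GAIN = 1.5         # assumed velocity boost when alignment is poor


def make_particle():
    """Random position within bounds, zero velocity, unset personal best."""
    pos = {k: random.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}
    return {"pos": pos, "vel": {k: 0.0 for k in BOUNDS},
            "best_pos": dict(pos), "best_fit": float("-inf")}


def fitness(bleu, attn_align):
    """Two-term objective: translation metric plus attention alignment."""
    return bleu + ALPHA * attn_align


def feedback_multiplier(attn_align):
    """Dynamic feedback: enlarge the velocity update when attention
    alignment falls below the threshold, pushing the swarm to move."""
    return FEEDBACK_GAIN if attn_align < ATTN_THRESHOLD else 1.0


def pso_step(particles, evaluate):
    """One PSO iteration. `evaluate(pos) -> (bleu, attn_align)` stands in
    for training/validating the NMT model at those hyperparameters."""
    for p in particles:
        bleu, attn = evaluate(p["pos"])
        p["attn"] = attn                      # kept for the feedback term
        fit = fitness(bleu, attn)
        if fit > p["best_fit"]:
            p["best_fit"], p["best_pos"] = fit, dict(p["pos"])
    gbest = max(particles, key=lambda q: q["best_fit"])
    for p in particles:
        m = feedback_multiplier(p["attn"])
        for k, (lo, hi) in BOUNDS.items():
            r1, r2 = random.random(), random.random()
            p["vel"][k] = m * (W * p["vel"][k]
                               + C1 * r1 * (p["best_pos"][k] - p["pos"][k])
                               + C2 * r2 * (gbest["best_pos"][k] - p["pos"][k]))
            p["pos"][k] = min(max(p["pos"][k] + p["vel"][k], lo), hi)
```

In use, one would build a swarm with make_particle() and call pso_step repeatedly, passing an evaluate callback that trains or fine-tunes the NMT model at the particle's hyperparameters and returns validation BLEU together with an attention alignment score. Note the design choice implied by the abstract: the multiplier only enlarges velocities, so the swarm explores more aggressively exactly when alignment degrades, while the position clamp keeps every hyperparameter in a valid range.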