Bi2-BERT-Based Long Sequence Text Vectorization Method for Power Grid Work Orders
Author(s): Xiangzhao Cheng (State Grid Tai'an Power Supply Company, China), Ting Li (State Grid Tai'an Power Supply Company, China), Zhuo Zhang (State Grid Tai'an Power Supply Company, China), Yue Zheng (State Grid Tai'an Power Supply Company, China), Guanglei Hu (State Grid Tai'an Power Supply Company, China), Hongxin Li (State Grid Tai'an Power Supply Company, China) and Tongqing Zhang (State Grid Tai'an Power Supply Company, China)
Copyright: 2025
Volume: 16
Issue: 1
Pages: 23
Source title: International Journal of Mobile Computing and Multimedia Communications (IJMCMC)
Editor(s)-in-Chief: Agustinus Waluyo (Monash University, Australia)
DOI: 10.4018/IJMCMC.368256
Abstract
With the continuous development of power systems, power grid work order data provides substantial data support for power facilities. However, the long text sequences in power grid work orders pose challenges for computer processing and analysis. To improve the accuracy of power grid work order processing, this paper proposes a long-sequence text vectorization method for power grid work orders based on leader grey wolf optimization-bidirectional long short-term memory and bidirectional encoder representations from transformers (Bi2-BERT). First, a long-sequence text feature extraction method is proposed in which BiLSTM extracts the features, improving global optimization performance. Second, a text vectorization method for power grid work orders based on an increment-stock bilevel-driven BERT is proposed to improve model learning efficiency and resolve the catastrophic forgetting problem. The results show that the proposed algorithm achieves high accuracy and fast convergence speed in long-sequence text vectorization of power grid work orders.
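The abstract outlines a two-stage pipeline: BiLSTM-based feature extraction over long sequences, followed by BERT-based vectorization. Below is a minimal, hypothetical sketch of one plausible reading of that combination in Python, assuming PyTorch and the HuggingFace transformers library: a long work order is split into BERT-sized chunks, each chunk's [CLS] embedding is extracted, and a BiLSTM fuses the chunk embeddings into a single document vector. The class name, the bert-base-chinese checkpoint, and the chunking scheme are illustrative assumptions only; the paper's leader grey wolf optimization of the BiLSTM and its increment-stock bilevel BERT training are not reproduced here.

```python
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel


class ChunkedBiLSTMVectorizer(nn.Module):
    """Illustrative sketch: chunk a long text for BERT, then fuse
    the chunk embeddings with a BiLSTM into one document vector.
    This is an assumed reading of the Bi2-BERT idea, not the
    authors' published implementation."""

    def __init__(self, bert_name="bert-base-chinese", hidden=256):
        super().__init__()
        self.tokenizer = AutoTokenizer.from_pretrained(bert_name)
        self.bert = AutoModel.from_pretrained(bert_name)
        self.bilstm = nn.LSTM(
            input_size=self.bert.config.hidden_size,
            hidden_size=hidden,
            bidirectional=True,
            batch_first=True,
        )

    @torch.no_grad()
    def forward(self, text: str, chunk_tokens: int = 510) -> torch.Tensor:
        # Tokenize the whole work order, then split into BERT-sized chunks
        # (510 content tokens leaves room for [CLS] and [SEP]).
        ids = self.tokenizer(text, add_special_tokens=False)["input_ids"]
        chunks = [ids[i:i + chunk_tokens]
                  for i in range(0, len(ids), chunk_tokens)]
        cls_vecs = []
        for chunk in chunks:
            inp = torch.tensor([[self.tokenizer.cls_token_id] + chunk
                                + [self.tokenizer.sep_token_id]])
            out = self.bert(input_ids=inp)
            cls_vecs.append(out.last_hidden_state[:, 0])  # [CLS] embedding
        seq = torch.stack(cls_vecs, dim=1)   # (1, n_chunks, hidden_size)
        fused, _ = self.bilstm(seq)          # BiLSTM over the chunk sequence
        return fused.mean(dim=1).squeeze(0)  # pooled document vector


vectorizer = ChunkedBiLSTMVectorizer()
vec = vectorizer("示例电网工单文本")  # example grid work-order text
print(vec.shape)  # torch.Size([512]) for hidden=256, bidirectional
```

In this reading, the BiLSTM supplies the cross-chunk context that a fixed-length BERT window cannot capture on its own, which matches the abstract's motivation for handling long work-order sequences.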
Related Content
Xiangzhao Cheng, Ting Li, Zhuo Zhang, Yue Zheng, Guanglei Hu, Hongxin Li, Tongqing Zhang. © 2025. 23 pages.
Xinli Zhu, Zhiqiang Gao, Xu An Wang. © 2025. 14 pages.
Xu-Jun Jian, Chao-Hung Wang, Tieh-Cheng Fu, Shiyang Lyu, David Taniar, Tun-Wen Pai. © 2025. 13 pages.
Jie Huang. © 2025. 23 pages.
Yue Hu, Yanan Wang, Wei Zhao, Li Shang, Yuhang Pang, Juan Pan, Tongtong Zhang, Weiwei Dou. © 2025. 22 pages.
Badreya Al-jenaibi. © 2024. 24 pages.
Wanqiao Wang, Jian Su, Hui Zhang, Luyao Guan, Qingrong Zheng, Zhuofan Tang, Huixia Ding. © 2024. 16 pages.