Deep Transfer Learning Based on LSTM Model for Reservoir Flood Forecasting
Author(s): Qiliang Zhu (North China University of Water Resources and Electric Power, China), Changsheng Wang (Water Conservancy and Irrigation District Engineering Construction Administration of Xixiayuan, China), Wenchao Jin (Water Conservancy and Irrigation District Engineering Construction Administration of Xixiayuan, China), Jianxun Ren (Water Resources Information Center of Henan Province, China) and Xueting Yu (North China University of Water Resources and Electric Power, China)
Copyright: 2024
Volume: 20
Issue: 1
Pages: 17
Source title:
International Journal of Data Warehousing and Mining (IJDWM)
Editor(s)-in-Chief: Eric Pardede (La Trobe University, Australia) and Kiki Adhinugraha (La Trobe University, Australia)
DOI: 10.4018/IJDWM.338912
Abstract
In recent years, deep learning has been widely used as an efficient prediction approach. However, such algorithms have strict requirements on the size of the training set: without enough samples to train the network, the desired accuracy is difficult to achieve. To address this scarcity of training samples, this article proposes a deep learning prediction model that integrates transfer learning and applies it to flood forecasting. The model uses a random forest algorithm to extract flood characteristics, then applies a transfer-learning strategy to fine-tune the parameters of a model pre-trained on data from a similar reservoir, and uses the resulting model for flood prediction at the target reservoir. Finally, an autoregressive algorithm is applied to the computed results to intelligently correct the errors in the predictions. A series of experiments shows that the proposed method is significantly more accurate than other classical methods.
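The core transfer step described in the abstract — pre-train on a data-rich similar reservoir, then fine-tune on the scarce target record — can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation: the LSTM network, the random-forest feature extraction, and the autoregressive error-correction stage are all omitted, and the toy below stands in a one-step linear autoregressive predictor trained by stochastic gradient descent on synthetic inflow series:

```python
import random

def fit_ar1(series, w=0.0, b=0.0, lr=0.01, epochs=5):
    """Fit a one-step predictor y_t ~ w*y_{t-1} + b by stochastic gradient descent."""
    for _ in range(epochs):
        for t in range(1, len(series)):
            err = w * series[t - 1] + b - series[t]
            w -= lr * err * series[t - 1]
            b -= lr * err
    return w, b

def mse(series, w, b):
    """Mean squared one-step prediction error over a series."""
    errs = [(w * series[t - 1] + b - series[t]) ** 2 for t in range(1, len(series))]
    return sum(errs) / len(errs)

random.seed(0)

# Long synthetic inflow record for a data-rich "source" reservoir ...
source = [0.5]
for _ in range(400):
    source.append(0.80 * source[-1] + 0.10 + random.gauss(0, 0.01))

# ... but only a short record for the hydrologically similar "target" reservoir.
target = [0.5]
for _ in range(20):
    target.append(0.75 * target[-1] + 0.12 + random.gauss(0, 0.01))

# Step 1: pre-train on the abundant source-reservoir data.
w_src, b_src = fit_ar1(source, epochs=200)

# Step 2: transfer -- initialize from the source weights and fine-tune
# briefly on the scarce target data.
w_ft, b_ft = fit_ar1(target, w=w_src, b=b_src)

# Baseline: the same model trained from scratch on the scarce target data alone.
w_scr, b_scr = fit_ar1(target)

print(mse(target, w_ft, b_ft) < mse(target, w_scr, b_scr))  # prints True
```

Because fine-tuning starts from parameters already close to a good solution for the similar reservoir, it reaches a lower error on the short target record than training from scratch with the same budget — the intuition behind the paper's cross-reservoir transfer.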
Related Content
Feiqi Liu, Dong Yang, Yuyang Zhang, Chengcai Yang, Jingjing Yang.
© 2024.
19 pages.
Qiliang Zhu, Changsheng Wang, Wenchao Jin, Jianxun Ren, Xueting Yu.
© 2024.
17 pages.
JianDong He.
© 2024.
14 pages.
Man Jiang, Qilong Han, Haitao Zhang, Hexiang Liu.
© 2023.
15 pages.