Global Stability Analysis for Complex-Valued Recurrent Neural Networks and Its Application to Convex Optimization Problems
Abstract
Global stability analysis for complex-valued artificial recurrent neural networks remains a largely unexplored topic in information science. This chapter presents global stability conditions for discrete-time and continuous-time complex-valued recurrent neural networks, regarded as nonlinear dynamical systems. Global asymptotic stability conditions for these networks are derived through suitable choices of activation functions. According to these stability conditions, there are classes of discrete-time and continuous-time complex-valued recurrent neural networks whose equilibrium point is globally asymptotically stable. Furthermore, the conditions are shown to apply successfully to solving convex programming problems, for which solution methods over the real field are generally tedious.
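The chapter's derivations are not reproduced here, but the sketch below illustrates the kind of discrete-time complex-valued recurrent iteration the abstract refers to. The update rule z(t+1) = f(W z(t) + b), the bounded activation f(z) = z / (1 + |z|), and the rescaling of W to a small spectral radius are illustrative assumptions, not the chapter's actual stability conditions; they simply make the iteration contract toward a single equilibrium so the globally stable behavior can be observed.

```python
# Minimal sketch of a discrete-time complex-valued recurrent network iteration.
# Assumptions (not taken from the chapter): update z(t+1) = f(W z(t) + b),
# the amplitude-squashing activation f(z) = z / (1 + |z|), and a weight matrix
# rescaled to a small spectral radius so the state settles to one equilibrium.
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random complex weight matrix, rescaled so its spectral radius is well below 1.
W = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
W *= 0.3 / max(abs(np.linalg.eigvals(W)))
b = rng.standard_normal(n) + 1j * rng.standard_normal(n)

def f(z):
    """Illustrative bounded activation applied element-wise to complex inputs."""
    return z / (1.0 + np.abs(z))

# Iterate from an arbitrary initial state and stop once the state has settled.
z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
for t in range(200):
    z_next = f(W @ z + b)
    if np.linalg.norm(z_next - z) < 1e-12:
        break
    z = z_next

print("iterations:", t, "equilibrium:", np.round(z, 4))
```

Starting the loop from different initial states reaches the same fixed point under these assumptions, which is the qualitative behavior that global asymptotic stability conditions guarantee.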