Global Stability Analysis for Complex-Valued Recurrent Neural Networks and Its Application to Convex Optimization Problems
Abstract
Global stability analysis of complex-valued recurrent neural networks remains a largely unexplored topic in information science. This chapter presents global stability conditions for discrete-time and continuous-time complex-valued recurrent neural networks, regarded as nonlinear dynamical systems. Global asymptotic stability conditions for these networks are derived through suitable choices of activation functions. Under these conditions, there exist classes of discrete-time and continuous-time complex-valued recurrent neural networks whose equilibrium point is globally asymptotically stable. Furthermore, the conditions are shown to apply successfully to convex programming problems, for which solution methods over the real field are generally tedious.
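The flavor of the discrete-time result can be illustrated with a minimal numerical sketch. The network model, activation function, and spectral-norm bound below are assumptions chosen for illustration, not the specific conditions derived in the chapter: we iterate a complex-valued recurrent update z ← f(Wz + b) with a 1-Lipschitz complex activation and a weight matrix of spectral norm below 1, so the map is a contraction and every initial state converges to the same equilibrium.

```python
import numpy as np

def cvnn_step(z, W, b):
    # One discrete-time update of a complex-valued recurrent network.
    # Activation f(u) = u / (1 + |u|) is applied elementwise; it is
    # 1-Lipschitz on the complex plane (a common sufficient property
    # in stability analyses of this kind).
    u = W @ z + b
    return u / (1.0 + np.abs(u))

rng = np.random.default_rng(0)
n = 4
W = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
# Scale the weights so the spectral norm is 0.4 < 1: combined with the
# 1-Lipschitz activation, the update is a contraction, so a unique
# globally asymptotically stable equilibrium exists (Banach fixed point).
W *= 0.4 / np.linalg.norm(W, 2)
b = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Iterate from two very different initial states.
z1 = rng.standard_normal(n) + 1j * rng.standard_normal(n)
z2 = 10 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
for _ in range(200):
    z1 = cvnn_step(z1, W, b)
    z2 = cvnn_step(z2, W, b)

# Both trajectories have converged to the same fixed point.
assert np.allclose(z1, z2)
```

The interesting part of the chapter's analysis is precisely how far beyond this simple contraction setting the stability conditions reach; the sketch only shows why a well-chosen activation and a norm condition on the weights can force global convergence.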