A Perturbation Size-Independent Analysis of Robustness in Neural Networks by Randomized Algorithms
Abstract
This chapter presents a general methodology for evaluating the loss in performance of a generic neural network once its weights are affected by perturbations. Since the weights represent the "knowledge space" of the neural model, robustness analysis can be used to study the weights/performance relationship. The perturbation analysis, which is closely related to sensitivity issues, relaxes the assumptions made in the related literature, such as the small-perturbation hypothesis, specific requirements on the distribution of perturbations and neural variables, constraints on the number of hidden units, and a fixed neural structure. The methodology, based on Randomized Algorithms, makes it possible to reformulate the computationally intractable problem of robustness/sensitivity analysis in a probabilistic framework, where the solution is obtained in time polynomial in the accuracy and confidence degrees.
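The probabilistic reformulation described above can be illustrated with a short sketch. The idea is to draw random weight perturbations, check how often the performance degradation stays within a tolerance, and size the number of samples with the additive Chernoff/Hoeffding bound, which is polynomial in the accuracy ε and confidence δ. The loss function, perturbation distribution, and tolerance γ below are hypothetical placeholders, not the chapter's actual experimental setup:

```python
import numpy as np

def chernoff_sample_size(epsilon, delta):
    """Samples needed so the empirical probability is within epsilon of the
    true one with confidence 1 - delta (additive Chernoff/Hoeffding bound):
    n >= ln(2/delta) / (2 * epsilon**2)."""
    return int(np.ceil(np.log(2.0 / delta) / (2.0 * epsilon ** 2)))

def robustness_estimate(loss_fn, weights, perturb_scale, gamma,
                        epsilon, delta, seed=None):
    """Monte Carlo estimate of P(loss increase <= gamma) under random
    Gaussian weight perturbations.

    loss_fn        -- maps a weight vector to a scalar loss (placeholder)
    perturb_scale  -- std. dev. of the perturbation on each weight
    gamma          -- tolerated performance degradation
    """
    rng = np.random.default_rng(seed)
    n = chernoff_sample_size(epsilon, delta)
    base_loss = loss_fn(weights)
    hits = 0
    for _ in range(n):
        dw = rng.normal(0.0, perturb_scale, size=weights.shape)
        if loss_fn(weights + dw) - base_loss <= gamma:
            hits += 1
    return hits / n

# Toy example: quadratic loss around a nominal weight vector.
w0 = np.zeros(10)
toy_loss = lambda w: float(np.sum(w ** 2))
p_robust = robustness_estimate(toy_loss, w0, perturb_scale=0.1,
                               gamma=0.2, epsilon=0.05, delta=0.01, seed=0)
```

Note the key property: the sample size depends only on ε and δ, not on the number of weights, which is what makes the analysis perturbation-size- and architecture-independent in cost.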