Optimization Problems Arising in Stability Analysis of Discrete Time Recurrent Neural Networks

dc.contributor.author: Singh, Jayant
dc.date.accessioned: 2016-01-22T21:13:46Z
dc.date.available: 2016-01-22T21:13:46Z
dc.date.issued: 2016
dc.description.abstract: We consider the method of Reduction of Dissipativity Domain for proving global Lyapunov stability of discrete-time recurrent neural networks. The standard and advanced criteria for absolute stability of these essentially nonlinear systems produce rather weak results; the method above proves to be more powerful. It involves a multi-step procedure in which special nonconvex functions are maximized over polytopes at every step. We derive conditions which guarantee that such functions have at most one point of local maximum over every hyperplane. This nontrivial result is valid for a wide range of neuron transfer functions.
dc.identifier.uri: https://hdl.handle.net/10365/25537
dc.publisher: North Dakota State University
dc.rights: NDSU Policy 190.6.2
dc.rights.uri: https://www.ndsu.edu/fileadmin/policy/190.pdf
dc.title: Optimization Problems Arising in Stability Analysis of Discrete Time Recurrent Neural Networks
dc.type: Dissertation
dc.type: Video
ndsu.advisor: Barabanov, Nikita
ndsu.college: Science and Mathematics
ndsu.degree: Doctor of Philosophy (PhD)
ndsu.department: Mathematics
ndsu.program: Mathematics
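
The abstract concerns global Lyapunov stability of discrete-time recurrent neural networks. As a purely illustrative sketch, not taken from the dissertation, the following Python snippet simulates a small network of the assumed form x_{k+1} = A x_k + B tanh(C x_k) and numerically checks whether a candidate quadratic Lyapunov function V(x) = x'Px decreases along sampled trajectories; the matrices A, B, C, P and all names are hypothetical choices made for demonstration only.

import numpy as np

# Hypothetical system matrices, chosen only so this sketch is contractive;
# they are not taken from the dissertation.
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
B = np.array([[0.2],
              [0.1]])
C = np.array([[1.0, -0.5]])

P = np.eye(2)  # candidate Lyapunov matrix (identity assumed for simplicity)


def step(x):
    """One step of the recurrent network x_{k+1} = A x + B tanh(C x)."""
    return A @ x + B @ np.tanh(C @ x)


def lyapunov_decreases(x0, n_steps=50, floor=1e-9):
    """Check that V(x) = x'Px strictly decreases along one trajectory,
    ignoring states whose V value is already below a small numerical floor."""
    x = x0
    for _ in range(n_steps):
        x_next = step(x)
        v, v_next = x @ P @ x, x_next @ P @ x_next
        if v > floor and v_next >= v:
            return False
        x = x_next
    return True


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = rng.uniform(-2.0, 2.0, size=(100, 2))
    print(all(lyapunov_decreases(x0) for x0 in samples))  # expected: True

Such a sampling test can only suggest, not prove, global stability; the dissertation's contribution is analytical, concerning maximization of nonconvex functions over polytopes within the Reduction of Dissipativity Domain procedure.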

Files

Original bundle (2 of 2 files shown)

Name: Jayant Singh video.mov
Size: 96.76 MB
Format: QuickTime video

Name: Optimization Problems Arising in Stability Analysis of Discrete Time Recurrent Neural Networks.pdf
Size: 542.55 KB
Format: Adobe Portable Document Format

License bundle (1 of 1 files shown)

Name: license.txt
Size: 1.71 KB
Description: Item-specific license agreed to upon submission