
dc.contributor.author: Singh, Jayant
dc.description.abstract: We consider the method of Reduction of Dissipativity Domain for proving global Lyapunov stability of discrete-time recurrent neural networks. The standard and advanced criteria for absolute stability of these essentially nonlinear systems produce rather weak results; the method above proves to be more powerful. It involves a multi-step procedure that maximizes special nonconvex functions over polytopes at every step. We derive conditions which guarantee the existence of at most one point of local maximum for such functions over every hyperplane. This nontrivial result is valid for a wide range of neuron transfer functions. [en_US]
dc.publisher: North Dakota State University [en_US]
dc.rights: NDSU Policy 190.6.2
dc.title: Optimization Problems Arising in Stability Analysis of Discrete Time Recurrent Neural Networks [en_US]
dc.type: Dissertation [en_US]
dc.type: Video [en_US]
dc.date.accessioned: 2016-01-22T21:13:46Z
dc.date.available: 2016-01-22T21:13:46Z
dc.date.issued: 2016
dc.identifier.uri: http://hdl.handle.net/10365/25537
dc.rights.uri: https://www.ndsu.edu/fileadmin/policy/190.pdf
ndsu.degree: Doctor of Philosophy (PhD) [en_US]
ndsu.college: Science and Mathematics [en_US]
ndsu.department: Mathematics [en_US]
ndsu.program: Mathematics [en_US]
ndsu.advisor: Barabanov, Nikita
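
The abstract describes a multi-step procedure whose core subproblem is maximizing a function over a polytope. As a generic illustration of that kind of subproblem (NOT the dissertation's actual algorithm), the sketch below runs projected gradient ascent over the simplest polytope, a box; the objective `f`, the box bounds, and the step size are all invented for the example.

```python
# Hedged sketch: maximize a smooth function over a simple polytope (a box)
# by projected gradient ascent. Generic illustration only; the objective,
# bounds, and step size are assumptions, not the dissertation's method.

def project_box(x, lo, hi):
    """Project a point componentwise onto the box [lo, hi]^n."""
    return [min(max(xi, lo), hi) for xi in x]

def grad_f(x):
    """Gradient of the toy objective f(x) = -(x0 - 0.8)^2 - (x1 - 0.3)^2."""
    return [-2.0 * (x[0] - 0.8), -2.0 * (x[1] - 0.3)]

def maximize_over_box(x0, lo=0.0, hi=1.0, step=0.1, iters=500):
    """Ascend the gradient, projecting back onto the box after each step."""
    x = list(x0)
    for _ in range(iters):
        g = grad_f(x)
        x = project_box([xi + step * gi for xi, gi in zip(x, g)], lo, hi)
    return x

x_star = maximize_over_box([0.0, 1.0])  # converges toward (0.8, 0.3)
```

For this toy concave objective the iteration contracts toward the unique maximizer; the dissertation's interest is precisely in conditions under which the genuinely nonconvex functions arising in the stability analysis still admit at most one local maximum over each hyperplane, so that such local ascent is meaningful.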

