Delay-dependent stability conditions of static recurrent neural networks: a non-linear convex combination method

A new method is developed in this study for the stability analysis of static recurrent neural networks with time-varying delay. Improved delay-dependent conditions, expressed as a set of linear matrix inequalities, are derived for this class of static networks through a newly proposed augmented Lyapunov–Krasovskii functional. The derivation employs a novel non-linear convex combination technique, namely quadratic convex combination. Different from previous results, the property of quadratic convex functions is fully exploited without resorting to Jensen's inequality. A numerical example is provided to verify the effectiveness and superiority of the presented results.

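The core of the quadratic convex combination argument can be sketched as follows; the symbols \xi, \Xi_0, \Xi_1, \Xi_2 and the delay bound h are illustrative placeholders rather than the paper's own notation. Suppose the derivative of the augmented Lyapunov–Krasovskii functional is bounded by a quadratic polynomial in the time-varying delay \tau(t):

\[
\dot V(t) \le f(\tau(t)) = \xi^{\mathsf T}\bigl(\tau(t)^{2}\,\Xi_{2} + \tau(t)\,\Xi_{1} + \Xi_{0}\bigr)\xi,
\qquad \xi^{\mathsf T}\Xi_{2}\,\xi \ge 0 .
\]

Since the leading coefficient is non-negative, f is a convex quadratic function of \tau(t), so its maximum over the delay interval [0, h] is attained at an endpoint:

\[
f(\tau) < 0 \ \ \forall\, \tau \in [0, h]
\quad\Longleftrightarrow\quad
f(0) < 0 \ \text{ and } \ f(h) < 0 .
\]

Negativity therefore needs to be enforced only at the two endpoints of the delay interval, which yields a pair of linear matrix inequality conditions without invoking Jensen's inequality.
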
Inspec keywords: quadratic programming; Lyapunov methods; delays; linear matrix inequalities; recurrent neural nets; time-varying systems; convex programming; stability criteria

Other keywords: time-varying delay; augmented Lyapunov-Krasovskii functional; static recurrent neural networks; nonlinear convex combination method; linear matrix inequalities; quadratic convex combination; delay-dependent stability conditions

Subjects: Optimisation techniques; Neural nets (theory); Algebra
