Two Entropy-Based Criteria Design for Signal Complexity Measures

Signal complexity denotes the intricate patterns hidden in the complicated dynamics emerging from the nonlinear system concerned. Measuring chaotic signal complexity in principle combines both the information entropy of the data under test and its embedded geometric features. Starting from the information source of Shannon's entropy, and informed by the merits and demerits of the 0-1 test for chaos, we propose new compression entropy criteria for identifying whether a signal is in a periodic, quasi-periodic, or chaotic state: the mapping results form a 3s-graph whose shape differs significantly between good and bad springs, and the Construction creep (CC) rate falls into the distinguishable value ranges [0, 7%], (7%, 50%], or (50%, 84%]. The simulation cases employ the evolutions of the Lorenz, Li, and He equations, under key information extraction rules combining two-layer compression functions and self-similarity calculation, and the results are compared with the 0-1 test for chaos, the Lyapunov exponent, and Spectral Entropy complexity. This work offers deeper insight into concise feature expressions for measuring chaotic signal complexity in the feature domain.
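The 0-1 test for chaos mentioned above (whose merits and demerits motivate the proposed criteria) can be illustrated with a minimal sketch of its standard Gottwald–Melbourne form, not the paper's own implementation: the signal drives translation variables p and q, and the growth statistic K of their mean-square displacement is near 0 for regular dynamics and near 1 for chaos. The translation frequency `c`, the cutoff `n_cut`, and the logistic-map demo signal are illustrative choices, not taken from the paper.

```python
import numpy as np

def zero_one_test(x, c=1.7, n_cut=None):
    """Basic 0-1 test for chaos (Gottwald-Melbourne form).

    Returns the statistic K: near 0 for periodic/quasi-periodic
    signals, near 1 for chaotic ones.  A single translation
    frequency c is used here; in practice K is usually taken as
    the median over many random c in (0, pi).
    """
    x = np.asarray(x, dtype=float)
    N = len(x)
    j = np.arange(1, N + 1)
    # Translation variables driven by the signal
    p = np.cumsum(x * np.cos(j * c))
    q = np.cumsum(x * np.sin(j * c))
    if n_cut is None:
        n_cut = N // 10              # displacement lags n << N
    M = np.empty(n_cut)
    for n in range(1, n_cut + 1):
        # Mean-square displacement of the (p, q) trajectory at lag n
        M[n - 1] = np.mean((p[n:] - p[:-n]) ** 2 + (q[n:] - q[:-n]) ** 2)
    n_vals = np.arange(1, n_cut + 1)
    # K = correlation of M(n) with n: ~1 if M grows linearly (chaos)
    return np.corrcoef(n_vals, M)[0, 1]

# Demo: a periodic signal versus a chaotic logistic-map orbit
t = np.arange(2000)
k_periodic = zero_one_test(np.sin(0.3 * t))
x = np.empty(2000)
x[0] = 0.4
for i in range(1, 2000):
    x[i] = 3.97 * x[i - 1] * (1 - x[i - 1])  # logistic map, chaotic regime
k_chaotic = zero_one_test(x)
print(k_periodic, k_chaotic)
```

A bounded (p, q) trajectory yields an oscillating M(n) and small K, while chaotic diffusion of (p, q) makes M(n) grow roughly linearly in n, driving K toward 1; this is the behavior the paper's 3s-graph and CC-rate criteria are designed to complement.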

Inspec keywords: signal processing; entropy; Lyapunov methods; computational complexity; chaos

Other keywords: entropy-based criteria design; information entropy; signal complexity measures; two-layer compression functions; feature domain; chaotic signal complexity measure; complicated dynamics; Lyapunov exponent; self-similarity calculation; compression entropy criteria; construction creep rate; spectral entropy complexity; information extracting rules; Shannon's entropy

Subjects: Signal processing and detection; Signal processing theory; Computational complexity

http://iet.metastore.ingenta.com/content/journals/10.1049/cje.2019.07.008