
Low-Dimensional Chaotic Signal Characterization Using Approximate Entropy



Presentation Transcript


  1. Low-Dimensional Chaotic Signal Characterization Using Approximate Entropy Soundararajan Ezekiel Matthew Lang Computer Science Department Indiana University of Pennsylvania

  2. Roadmap • Overview • Introduction • Basics and Background • Methodology • Experimental Results • Conclusion

  3. Overview • Many signals appear to be random • They may instead be chaotic or fractal in nature • We must be wary of noise in the system • Analysis of chaotic properties is therefore in order • Our method: approximate entropy

  4. Introduction • Chaotic behavior is a lack of periodicity • Historically, non-periodicity implied randomness • Today, we know such behavior may instead be chaotic or fractal in nature • Fractal and chaos analysis give us the power to characterize these signals

  5. Introduction • Chaotic systems have four essential characteristics: • deterministic system • sensitive to initial conditions • unpredictable behavior • values depend on attractors

  6. Introduction • The attractor's dimension is useful and a good starting point • Even an incomplete description of the attractor is useful

  7. Basics and Background • Fractal analysis • The fractal dimension is defined for a set whose Hausdorff-Besicovitch dimension exceeds its topological dimension • Such sets can also be described by the self-similarity property • Goal: find self-similar features and characterize the data set

  8. Basics and Background • Chaotic analysis • Output of system mimics random behavior • Goal: determine mathematical form of process • Performed by transforming data to a phase space

  9. Basics and Background • Definitions • Phase Space: an n-dimensional space, where n is the number of dynamical variables • Attractor: the finite set in phase space formed by the values of the variables • Strange Attractor: an attractor that is fractal in nature

  10. Basics and Background • Analysis of phase space • Determine topological properties • visual analysis • capacity, correlation, information dimension • approximate entropy • Lyapunov exponents

  11. Basics and Background • Fractal dimension of the attractor • Related to the number of independent variables needed to generate the time series • The number of independent variables is the smallest integer greater than the fractal dimension of the attractor

  12. Basics and Background • Box Dimension • An estimator for the fractal dimension • A measure of the geometric aspect of the signal on the attractor • Count of the boxes of shrinking size needed to cover the attractor (see the sketch below)
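
To make the box-counting idea concrete, here is a minimal sketch (not from the slides) of a box-dimension estimator for a two-dimensional phase-space reconstruction; the function name, the choice of box sizes, and the least-squares fit are illustrative assumptions.

```python
# Hypothetical sketch of box counting: cover the attractor with grids of
# shrinking box size eps, count the occupied boxes N(eps), and take the
# slope of log N(eps) versus log(1/eps) as the box-dimension estimate.
import numpy as np

def box_dimension(points, box_sizes):
    """points: (n, 2) array of phase-space points; box_sizes: iterable of eps values."""
    mins = points.min(axis=0)
    counts = []
    for eps in box_sizes:
        # Assign each point to a box index and count the distinct occupied boxes.
        idx = np.floor((points - mins) / eps).astype(int)
        counts.append(len({tuple(i) for i in idx}))
    # Least-squares slope of log N(eps) against log(1/eps).
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope
```

For instance, box_dimension(points, [0.5, 0.25, 0.125, 0.0625]) fits the slope over four box sizes; in practice the quality of the estimate depends on how many sizes and points are available.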

  13. Basics and Background • Information dimension • Similar to the box dimension • Accounts for the frequency of visitation of each box • Based on point weighting - measures the rate of change of information content (see the sketch below)
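
The same box-counting loop can be weighted by visitation frequency. The sketch below (an illustrative assumption, not the authors' code) fits the Shannon information I(eps) = -Σ p_i log p_i against log(1/eps), whose slope estimates the information dimension.

```python
# Hypothetical information-dimension sketch: weight each occupied box by the
# fraction of points it contains and fit I(eps) = -sum(p_i * log p_i)
# against log(1/eps); the slope estimates the information dimension.
import numpy as np

def information_dimension(points, box_sizes):
    mins = points.min(axis=0)
    info = []
    for eps in box_sizes:
        idx = np.floor((points - mins) / eps).astype(int)
        _, counts = np.unique(idx, axis=0, return_counts=True)
        p = counts / counts.sum()            # visitation frequency of each box
        info.append(-(p * np.log(p)).sum())  # Shannon information at this eps
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), info, 1)
    return slope
```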

  14. Methodology • Approximate entropy is based on the information dimension • The signal is embedded in low dimensions • The computation is similar to that of the correlation dimension

  15. Algorithm • Given a signal {Si}, calculate the approximate entropy for {Si} by the following steps. Note that the approximate entropy may be calculated for the entire signal, or an entropy spectrum may be calculated over windows {Wi} on {Si}. If the entropy of the entire signal is being calculated, take {Wi} = {Si}.

  16. Algorithm • Step 1: Truncate the peaks of {Wi}. During the digitization of analog signals, some spurious values may be generated by the monitoring equipment. • Step 2: Calculate the mean and standard deviation (Sd) of {Wi} and compute the tolerance limit R = 0.3 * Sd to reduce the effect of noise (see the sketch below).
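
A minimal sketch of Steps 1-2, assuming a percentile-based clipping rule for the peak truncation (the slide does not specify one) and the R = 0.3 * Sd tolerance from Step 2; the function name and the 99.5th-percentile threshold are illustrative.

```python
# Hypothetical sketch of Steps 1-2: clip extreme digitization peaks, then
# derive the tolerance limit R as 0.3 times the window's standard deviation.
import numpy as np

def preprocess_window(w, clip_percentile=99.5):
    w = np.asarray(w, dtype=float)
    peak = np.percentile(np.abs(w), clip_percentile)  # assumed clipping level
    w = np.clip(w, -peak, peak)                       # Step 1: truncate peaks
    r = 0.3 * w.std()                                 # Step 2: tolerance R = 0.3 * Sd
    return w, r
```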

  17. Algorithm • Step 3: Construct the phase space by plotting {Wi} vs. {Wi+τ}, where τ is the time lag, in an E = 2 space. • Step 4: Calculate the Euclidean distance Di between each pair of points in the phase space. For each i, count Ci(R), the number of pairs for which Di < R (see the sketch below).
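
Steps 3-4 might look like the sketch below: a delay embedding in E dimensions followed by a pairwise-distance count. The function names and the default lag τ = 1 are assumptions; self-matches are counted here, as in the standard approximate-entropy definition, which differs slightly from the slide's "pairs" wording.

```python
# Hypothetical sketch of Steps 3-4: delay-embed the window into E dimensions,
# then count, for each embedded point, how many points lie within distance R.
import numpy as np

def embed(w, E, tau=1):
    """Delay embedding: rows are (w[i], w[i+tau], ..., w[i+(E-1)*tau])."""
    n = len(w) - (E - 1) * tau
    return np.column_stack([w[i * tau : i * tau + n] for i in range(E)])

def neighbour_counts(X, r):
    # Pairwise Euclidean distances between all embedded points (Step 4).
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Count neighbours within r for each point; self-matches are included.
    return (d < r).sum(axis=1)
```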

  18. Algorithm • Step 5: Calculate the mean of Ci(R); the log of this mean is the approximate entropy Apn(E) for Euclidean dimension E = 2. • Step 6: Repeat Steps 2-5 for E = 3. • Step 7: The approximate entropy of {Wi} is Apn(2) - Apn(3) (see the combined sketch below).
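
Putting Steps 5-7 together, reusing preprocess_window, embed and neighbour_counts from the sketches above: Apn(E) is the log of the mean neighbour count at embedding dimension E, and the window's approximate entropy is Apn(2) - Apn(3). This is a sketch under the stated assumptions, not the authors' implementation.

```python
# Hypothetical end-to-end sketch of Steps 5-7 (reuses the helper functions
# sketched above for Steps 1-4).
import numpy as np

def apn(w, E, r, tau=1):
    """Log of the mean neighbour count C_i(R) at embedding dimension E (Step 5)."""
    counts = neighbour_counts(embed(w, E, tau), r)
    return np.log(counts.mean())

def approximate_entropy(window, tau=1):
    w, r = preprocess_window(window)                 # Steps 1-2
    return apn(w, 2, r, tau) - apn(w, 3, r, tau)     # Steps 5-7: Apn(2) - Apn(3)
```

On a noise-like window this difference stays comparatively high, while a strongly periodic window yields a low value, matching the interpretation on the Conclusion slide.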

  19. Noise

  20. HRV (young subject)

  21. HRV (older subject)

  22. Stock Signal

  23. Seismic Signal

  24. Seismic Signal

  25. Conclusion • High approximate entropy indicates randomness • Low approximate entropy indicates periodicity • Approximate entropy can be used to evaluate the predictability of a signal • Low predictability corresponds to randomness
