What is it about?

Entropy measures are effective features for time series classification problems. Traditional entropy measures, such as Shannon entropy, rely on the probability distribution function of the signal. However, for the effective separation of time series, new entropy estimation methods are required that characterize the chaotic dynamics of the system. Our concept of Neural Network Entropy (NNetEn) is based on the classification of special datasets in relation to the entropy of the time series recorded in the reservoir of a neural network. NNetEn estimates the chaotic dynamics of a time series in an original way and does not rely on probability distribution functions.
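
As a rough illustration, here is a minimal sketch of how the accompanying NNetEn Python package (installable via pip install NNetEn) can be called on a chaotic time series. The parameter names (database, mu, epoch, method, metric) follow the usage example in the paper; treat them as assumptions and check them against the version you install.

import numpy as np
from NNetEn import NNetEn_entropy  # the authors' package (pip install NNetEn)

# Example input: a chaotic logistic-map series, x[i] = r * x[i-1] * (1 - x[i-1]), r = 3.9
N = 1000
x = np.empty(N)
x[0] = 0.5
for i in range(1, N):
    x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])

# 'D1' selects the MNIST-10 reference dataset; mu sets the fraction of it used.
nnet = NNetEn_entropy(database='D1', mu=1)

# The entropy value is the classification metric ('Acc') of the reservoir
# network after 5 training epochs; 'method' selects how the time series
# fills the reservoir matrix.
value = nnet.calculation(x, epoch=5, method=3, metric='Acc', log=False)
print('NNetEn =', value)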


Why is it important?

In practice, it is important to understand which entropies to use for signal separation and classification, yet each case requires its own parameters and types of entropy. NNetEn is the first entropy measure that does not use the probability distribution function and still provides useful information about the signal in classification problems. For each individual task, a specific entropy measure must be tested, as it is difficult to predict how effective the measure will be in a given setting. The presented examples show that entropy depends in a complex way on the calculation parameters, and that useful information can be extracted from changes in entropy. Signals with similar dynamics often show different dependencies of entropy on these parameters. Implementing NNetEn in Python will allow the scientific community to apply the algorithm and identify the classes of problems that NNetEn solves effectively. The review of previous studies demonstrates NNetEn's success in practical problems, and the paper also illustrates NNetEn with a practical example: in terms of feature strength, NNetEn ranks among the leading features for the classification of EEG signals from certain channels (see the sketch below).
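
For instance, a single NNetEn value per signal can serve as a scalar feature for a standard classifier. The sketch below uses synthetic signals (a noisy sine wave versus a logistic map), not the paper's EEG recordings, and assumes the same package interface as in the example above:

import numpy as np
from NNetEn import NNetEn_entropy
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
nnet = NNetEn_entropy(database='D1', mu=1)

def make_signal(chaotic, n=300):
    # Synthetic stand-ins for two signal classes (not the paper's EEG data).
    if chaotic:
        x = np.empty(n)
        x[0] = rng.uniform(0.1, 0.9)
        for i in range(1, n):
            x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])
        return x + 0.01 * rng.standard_normal(n)
    return np.sin(np.linspace(0, 20 * np.pi, n)) + 0.01 * rng.standard_normal(n)

labels = [1] * 20 + [0] * 20
signals = [make_signal(c == 1) for c in labels]

# One NNetEn value per signal is used as the (single) feature.
X = np.array([[nnet.calculation(s, epoch=5, method=3, metric='Acc', log=False)]
              for s in signals])
y = np.array(labels)

print('CV accuracy:', cross_val_score(LogisticRegression(), X, y, cv=5).mean())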

Perspectives

Parallelizing the NNetEn algorithm, applying the new metric to different practical problems, and investigating the effect of the new metric on different classification algorithms can be considered future research directions.

Dr. Andrei Velichko
Petrozavodsk State University

Read the Original

This page is a summary of: Neural Network Entropy (NNetEn): Entropy-Based EEG Signal and Chaotic Time Series Classification, Python Package for NNetEn Calculation, Algorithms, May 2023, MDPI AG,
DOI: 10.3390/a16050255.
