EntroPy is a Python 3 package providing several time-efficient algorithms for computing the complexity of one-dimensional time series. It can be used, for example, to extract features from EEG signals.
Develop mode
git clone https://github.com/raphaelvallat/entropy.git entropy/
cd entropy/
pip install -r requirements.txt
python setup.py develop
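To verify the install, import the package from a fresh interpreter (this assumes EntroPy exposes a __version__ attribute, as most packages do):

python -c "import entropy; print(entropy.__version__)"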
Dependencies
- numpy
- scipy
- scikit-learn
- numba
from entropy import *
import numpy as np
np.random.seed(1234567)
x = np.random.rand(3000)
print(perm_entropy(x, order=3, normalize=True)) # Permutation entropy
print(spectral_entropy(x, 100, method='welch', normalize=True)) # Spectral entropy
print(svd_entropy(x, order=3, delay=1, normalize=True)) # Singular value decomposition entropy
print(app_entropy(x, order=2, metric='chebyshev')) # Approximate entropy
print(sample_entropy(x, order=2, metric='chebyshev')) # Sample entropy
0.9995858289645746
0.9945519071575192
0.8482185855709181
2.0754913760787277
2.192416747827227
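As a quick illustration of the EEG use case mentioned above, the same functions can be applied over sliding windows to build a feature matrix. This is only a sketch with arbitrary choices (synthetic signal, 100 Hz sampling rate, 2-second windows), not an EntroPy recommendation:

import numpy as np
from entropy import perm_entropy, sample_entropy

sf = 100                            # assumed sampling frequency (Hz)
signal = np.random.rand(30 * sf)    # 30 s of synthetic "EEG"
win = 2 * sf                        # 2-second, non-overlapping windows

features = []
for start in range(0, len(signal) - win + 1, win):
    epoch = signal[start:start + win]
    features.append([perm_entropy(epoch, order=3, normalize=True),
                     sample_entropy(epoch, order=2, metric='chebyshev')])
features = np.asarray(features)     # shape (n_windows, n_features)

Each row of the resulting matrix can then be fed to a classifier, e.g. from scikit-learn.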
print(petrosian_fd(x)) # Petrosian fractal dimension
print(katz_fd(x)) # Katz fractal dimension
print(higuchi_fd(x, kmax=10)) # Higuchi fractal dimension
1.0303256054255618
9.496389529050981
1.9914197968462963
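As a rough sanity check (a general property of fractal dimensions, not an EntroPy guarantee): the Higuchi fractal dimension should be close to 1 for a smooth deterministic signal and close to 2 for white noise, consistent with the value printed above.

t = np.linspace(0, 10, 3000)
sine = np.sin(2 * np.pi * t)        # smooth, densely sampled sine wave
print(higuchi_fd(sine, kmax=10))    # expected close to 1
print(higuchi_fd(x, kmax=10))       # expected close to 2 (white noise, see above)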
print(detrended_fluctuation(x)) # Detrended fluctuation analysis
0.5082304865081877
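For interpretation (a textbook property of DFA, not specific to EntroPy): uncorrelated white noise yields an exponent near 0.5, as printed above, whereas its cumulative sum, a random walk, yields an exponent near 1.5:

walk = np.cumsum(np.random.randn(3000))   # Brownian-like random walk
print(detrended_fluctuation(walk))        # expected near 1.5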
Here are some benchmarks computed on an average PC (i7-7700HQ CPU @ 2.80 GHz, 8 GB of RAM).
from entropy import *
import numpy as np
np.random.seed(1234567)
x = np.random.rand(1000)
# Entropy
%timeit perm_entropy(x, order=3, delay=1)
%timeit spectral_entropy(x, 100, method='fft')
%timeit svd_entropy(x, order=3, delay=1)
%timeit app_entropy(x, order=2) # Slow
%timeit sample_entropy(x, order=2) # Numba
# Fractal dimension
%timeit petrosian_fd(x)
%timeit katz_fd(x)
%timeit higuchi_fd(x) # Numba
# Other
%timeit detrended_fluctuation(x) # Numba
127 µs ± 3.86 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)
150 µs ± 859 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)
42.4 µs ± 306 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)
4.59 ms ± 62.2 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
2.03 ms ± 39.5 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
16.4 µs ± 251 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)
32.4 µs ± 578 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)
17.4 µs ± 274 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)
755 µs ± 17.1 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
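When reproducing these numbers, keep in mind that the functions marked # Numba are compiled just-in-time: the very first call pays a one-off compilation cost. A common warm-up pattern (a suggestion, not part of the benchmark above):

_ = higuchi_fd(x)      # first call triggers JIT compilation
%timeit higuchi_fd(x)  # subsequent calls reflect steady-state speed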
EntroPy was created and is maintained by Raphael Vallat. Contributions are more than welcome, so feel free to contact me, open an issue, or submit a pull request!
To see the code or report a bug, please visit the GitHub repository.
Note that this program is provided with NO WARRANTY OF ANY KIND. If you can, always double-check the results.
Several functions of EntroPy were borrowed from:
- MNE-features: https://github.com/mne-tools/mne-features
- pyEntropy: https://github.com/nikdon/pyEntropy
- pyrem: https://github.com/gilestrolab/pyrem
- nolds: https://github.com/CSchoel/nolds
All the credit goes to the authors of these excellent packages.