entropy's People

Contributors

pranaysy, raphaelvallat


entropy's Issues

module 'entropy' has no attribute error for all entropy/complexity tests

A bizarre issue has come up after formatting my PC and reinstalling entropy. I am not sure if I'm making a mistake or if the newest version of entropy broke something.

This is the code I am using for a two-channel resting-state EEG study.

import yasa
import mne
import numpy as np
import seaborn as sns
import pandas as pd
import entropy as ent

sns.set(font_scale=1.2)

# P1

# Load data as an EEGLAB file
f = mne.io.read_raw_eeglab('D:/study/EEGLAB/resting state/Recordings/bandpass 0point5 - 45Hz +ASR/1_EEG_LR_RS.set', eog=(), preload=True, uint16_codec=None, verbose=None)

# Load data (convert from volts to microvolts)
data = f._data * 1e6

sf = f.info['sfreq']
chan = f.ch_names
times = np.arange(data.shape[1]) / sf

print(data.shape, np.round(data[0:5], 3))

# Convert the EEG data to 30-sec epochs
times, data_win = yasa.sliding_window(data, sf, window=51)

# Convert times to minutes
times /= 60

from numpy import apply_along_axis as apply

def lziv(x):
    """Binarize the EEG signal and calculate the Lempel-Ziv complexity."""
    return ent.lziv_complexity(x > x.mean(), normalize=True)

df_all = []

for i, c in enumerate(chan):
    data_win_ep = data_win[i, :, :]
    # Calculate entropy for this channel
    df_chan = {
        # Entropy
        'perm_entropy': apply(ent.perm_entropy, axis=1, arr=data_win_ep, normalize=True),
        'svd_entropy': apply(ent.svd_entropy, 1, data_win_ep, normalize=True),
        'spec_entropy': apply(ent.spectral_entropy, 1, data_win_ep, sf=sf, nperseg=data_win_ep.shape[1], normalize=True),
        'sample_entropy': apply(ent.sample_entropy, 1, data_win_ep),
        # Fractal dimension
        'dfa': apply(ent.detrended_fluctuation, 1, data_win_ep),
        'petrosian': apply(ent.petrosian_fd, 1, data_win_ep),
        'katz': apply(ent.katz_fd, 1, data_win_ep),
        'higuchi': apply(ent.higuchi_fd, 1, data_win_ep),
        'lziv': apply(lziv, 1, data_win_ep),
    }
    df_chan['Channel'] = c
    # Append to a larger dataframe
    df_all.append(df_chan)

df_all = pd.DataFrame(df_all)
df_all.to_csv(r'D:\study\EEGLAB\resting state\nonlinear - ASR + BANDPASS\1_nonlinear.csv')

Error: AttributeError: module 'entropy' has no attribute 'perm_entropy'. This happens with all the different entropy/complexity functions; it just stops at the first one.

Thank you for your help in the matter.
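As a first debugging step (a suggested sketch, not part of the original report), it can help to check which 'entropy' module is actually being imported and what it exposes:

import entropy

# Where was the imported module found, and which public names does it expose?
print(getattr(entropy, '__file__', None), list(getattr(entropy, '__path__', [])))
print([name for name in dir(entropy) if not name.startswith('_')])
# If perm_entropy, sample_entropy, etc. are missing from this list, a different
# 'entropy' package (e.g. an unrelated one installed from PyPI) is likely
# shadowing the code from this repository.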

Empty package

Hey There,

I just cloned the repository and installed the entropy package, but it seems to be empty:

In [1]: import entropy                                                          

In [2]: dir(entropy)                                                            
Out[2]: ['__doc__', '__loader__', '__name__', '__package__', '__path__', '__spec__']

If I try to import any specific function, I get the error:

In [3]: from entropy import spectral_entropy                                    
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-3-06a7b4be4b3b> in <module>
----> 1 from entropy import spectral_entropy

ImportError: cannot import name 'spectral_entropy'

I'm running Python 3.6 from the official python:3.6-slim Docker image.

Removing a vector for approximate entropy

Hi @raphaelvallat, I have been finding your packages and your guides extremely helpful!

I'm currently working on NeuroKit with @DominiqueMakowski and we are looking at implementing functions for different entropy measures. I have a small question regarding your implementation of ApEn below:

def _app_samp_entropy(x, order, metric='chebyshev', approximate=True):

    if approximate:
        emb_data1 = _emb_data1
    else:
        emb_data1 = _emb_data1[:-1]

It seems that the last vector in the embedded time series is removed when approximate is False (sample entropy). However, I couldn't find the rationale for this particular removal. I would really appreciate it if you could point me in the right direction.

Many thanks!
Tam
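
Below is a small illustration of what the truncation does to the template counts, assuming the standard Richman and Moorman sample entropy formulation in which matches of length order and order + 1 are counted over the same number of templates; this is an assumption about the rationale, not a confirmed answer:

import numpy as np

def _embed(x, order, delay=1):
    """Time-delay embedding: one row per embedding vector (illustrative helper)."""
    n = len(x) - (order - 1) * delay
    return np.array([x[i * delay:i * delay + n] for i in range(order)]).T

x = np.random.randn(100)
m = 2
print(_embed(x, m).shape)      # (99, 2): N - m + 1 vectors of length m
print(_embed(x, m + 1).shape)  # (98, 3): N - m vectors of length m + 1
# Dropping the last length-m vector (emb_data1[:-1]) leaves 98 templates, so the
# match counts for lengths m and m + 1 are computed over the same number of
# templates.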

Entropy module has no attribute 'perm_entropy'

When trying to run the sleep staging prediction code below on my own .EDF dataset, I get the same entropy error: module 'entropy' has no attribute 'perm_entropy'. I was able to import my dataset and print summary statistics for it, but I am getting stuck here. Please let me know what troubleshooting I can try. Thanks!

sls = yasa.SleepStaging(raw, eeg_name="LEEG3_Ch12", eog_name="LEOG_Ch2", emg_name="LEMG_Ch5", metadata=dict(age=1.8, male=False))

# Getting the predicted sleep stages is now as easy as:
y_pred = sls.predict()

AttributeError: module 'entropy' has no attribute 'perm_entropy'

NameError: name 'unicode_type' is not defined

Greetings. Everything was working fine two weeks ago, but now I am getting this error:
NameError: name 'unicode_type' is not defined

I am using Google Colab. You can reproduce the problem with:

!pip install git+https://github.com/raphaelvallat/entropy.git
from entropy import *

Thanks

Publish in PyPi with different name

Hi,
Thanks for the package, it's awesome!
Why don't you want to publish it on PyPI under a different name? You'd get way more users, starting with me. Handling packages manually is not practical by any means. Your code is much harder to use in production than it would be if it were on PyPI.

That's just a suggestion, great work on the package !

Sample entropy returns inf

Hi,
Somehow, when I try to calculate the sample entropy of a series, it returns inf.
I couldn't find anything about this in the documentation.

This is a dataset that leads to an infinite sample entropy with the Chebyshev distance and order=2:

[0, 4, 3, 1, 0, 4, 1, 9, 6, 0, 4, 6, 4, 3, 8, 8, 3, 5, 6, 6, 1, 6, 5, 2, 0, 6, 7, 6, 5, 8, 5, 2, 5, 5, 0, 0, 8, 5, 8, 9, 3, 5, 3, 6, 8, 4, 6, 5, 4, 5, 7]

Might this be because the dataset is too small?
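
For reference, a minimal reproduction sketch; it assumes this package's sample_entropy(x, order, metric) signature, and the comment offers a plausible explanation rather than a confirmed diagnosis:

import numpy as np
from entropy import sample_entropy

# The series from the report above, with order=2 and the Chebyshev metric.
x = np.array([0, 4, 3, 1, 0, 4, 1, 9, 6, 0, 4, 6, 4, 3, 8, 8, 3, 5, 6, 6, 1,
              6, 5, 2, 0, 6, 7, 6, 5, 8, 5, 2, 5, 5, 0, 0, 8, 5, 8, 9, 3, 5,
              3, 6, 8, 4, 6, 5, 4, 5, 7])
print(sample_entropy(x, order=2, metric='chebyshev'))
# Sample entropy is -log(A / B), where A counts template matches of length
# order + 1 and B of length order. With only ~50 samples it is plausible that
# no (order + 1)-length match is found, so A = 0 and the result is infinite.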

Cannot run the code

I get an error when importing entropy: No module named 'llvmlite.llvmpy.ee'. Can you tell me how to solve it?

Inaccuracies in LZ complexity estimates

Hi Raphael!

I'm a big fan of your work and a regular user of your packages entropy, pingouin, and yasa. These are great contributions, and I thank you for them :)

I've been relying on your Numba implementation of lempel_ziv_complexity for various analyses I'm working on. I found myself in need of a faster implementation, and comments in your code pointed me to Naereen's Cython implementation.

I briefly compared the timings and accuracy of your Numba version with both the pure Python and Cython versions by Naereen. With string inputs, somehow the Numba version is slower than pure Python, both of which are slower than the Cython version. But more importantly, the estimates are different. The pure Python and Cython versions return the exact same estimates, but the Numba version returns different estimates, which are usually smaller.

I tried all three methods on 1000 randomly generated strings each, for lengths [10, 100, 1000, 10000]. Here's a quick visual summary of the timings, raw estimates, and differences between the three. The difference in estimates grows with string length. :(

[Figure: timings, raw estimates, and differences between the three implementations]

I tried out a few examples by hand too. For example:

  • the string rpfqvnradv is decomposed by LZ76 to ['r', 'p', 'f', 'q', 'v', 'n', 'ra', 'd'] or 8 unique substrings
    • Both the Cython and pure Python versions return 8
    • The Numba version returns 9
  • the string msduldaefd is decomposed by LZ76 to ['m', 's', 'd', 'u', 'l', 'da', 'e', 'f'] or 8 unique substrings
    • Both the Cython and pure Python versions return 8
    • The Numba version returns 9

I haven't spent any time troubleshooting the Numba code beyond this. If I get time to go through your implementation, I'll see if there's anything I can do to help or address this issue.
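
For comparison, here is a minimal sketch of the set-based LZ76 phrase counting used by the pure-Python and Cython versions referenced above (an illustrative re-implementation, not the package's Numba code); it reproduces the count of 8 for both example strings:

def lz76_count(s):
    """Count LZ76 phrases by accumulating previously unseen substrings."""
    phrases = set()
    ind, inc = 0, 1
    while ind + inc <= len(s):
        sub = s[ind:ind + inc]
        if sub in phrases:
            inc += 1          # extend the current candidate phrase
        else:
            phrases.add(sub)  # new phrase: record it and start the next one
            ind += inc
            inc = 1
    return len(phrases)

print(lz76_count("rpfqvnradv"))  # 8
print(lz76_count("msduldaefd"))  # 8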


Here are versions of packages I've used on Python 3.8.5:

  • Numba - 0.51.2
  • Numpy - 1.19.2
  • EntroPy - 0.1.2
  • Cython - 0.29.21

System: 10th Gen Intel Core i5-1035G4 running PopOS 20.10, kernel 5.8; gcc 10.2.0; llvmlite 0.34.0

This package is now deprecated!

Because of a PyPI conflict that prevented using pip to install this project (see #11), I have decided to duplicate this repository under a different name: AntroPy (https://github.com/raphaelvallat/antropy). I recommend that all users switch to the new package, which can be installed with pip install antropy. The current EntroPy package will be progressively deprecated over the next few weeks/months.

Spectral entropy not stable

Hello Raphael,

I analyzed the formula for the spectral entropy.

You provide two methods: periodogram and Welch.

However, the periodogram is unstable and sometimes leads to NaN/infinity due to the log2 computation.
In order to improve this formula, I would advise you to:

  1. add a note about the use of the periodogram
  2. add a default value of 1 for the sampling frequency
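
For context, a minimal sketch of the normalized spectral entropy computation, assuming the standard definition -sum(p * log2(p)) / log2(N) rather than the exact EntroPy code; it shows where the log2 instability comes from:

import numpy as np
from scipy.signal import periodogram, welch

def spectral_entropy_from_psd(psd):
    p = psd / psd.sum()
    # Any zero bin gives log2(0) = -inf and 0 * -inf = nan, which is the
    # instability described above.
    return -np.sum(p * np.log2(p)) / np.log2(len(p))

x = np.random.randn(1000)
sf = 100.0  # point 2 above suggests defaulting this to 1 when unspecified

_, psd_per = periodogram(x, fs=sf)
_, psd_wel = welch(x, fs=sf, nperseg=256)
print(spectral_entropy_from_psd(psd_per))  # raw periodogram: zero bins may yield nan
print(spectral_entropy_from_psd(psd_wel))  # Welch averaging makes zero bins much rarer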

Best, -MH

Joint Entropy?

Many thanks for providing these great entropy estimators as open source software.

Is there a way to calculate joint entropy using these entropy approximations? The ability to calculate joint entropies would allow the calculation of additional information measures such as mutual information and conditional mutual information.

The entropy_estimators package, which estimates entropy using a k-nearest-neighbor approach, calculates the joint entropy of X and Y by concatenating their values with np.c_[x, y]. This doesn't apply here because the inputs are all coerced into a one-dimensional vector, and this approach likely doesn't make sense outside of the k-nearest-neighbor setting. Is there an alternative way to think about joint entropies with respect to these estimators? In particular, I'm working with the approximate, sample, and permutation entropies.
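
For illustration, a minimal sketch of the concatenation described above; it only shows the NumPy mechanics and does not imply that such an input would be accepted by the estimators in this package:

import numpy as np

# np.c_ stacks the two series column-wise, so each row is one joint sample
# (x_i, y_i), which is what a k-nearest-neighbor entropy estimator consumes.
# The estimators discussed here expect a single 1-D series instead.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = rng.normal(size=100)
xy = np.c_[x, y]
print(xy.shape)  # (100, 2)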
