Comments (4)
Is this implementation good?
https://github.com/jundongl/scikit-feature/blob/master/skfeature/utility/entropy_estimators.py
I don't have time to vouch for other people's code. There is an implementation in scikit-learn, which is what I would use to compute the MI between categorical variables.
https://scikit-learn.org/stable/modules/generated/sklearn.metrics.mutual_info_score.html
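For a quick illustration of what that looks like in practice, here is a minimal sketch (not code from this repository; the label arrays are made-up example data):

```python
import numpy as np
from sklearn.metrics import mutual_info_score

# two categorical variables, encoded as integer labels (made-up example data)
x = np.array([0, 0, 1, 1, 2, 2, 0, 1])
y = np.array([0, 0, 1, 1, 1, 1, 0, 1])

# MI (in nats), computed from the empirical contingency table of the two labelings
print(mutual_info_score(x, y))
```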
I see, thanks for the quick answer.
It would be very kind of you to share some links to help me understand why the Kozachenko–Leonenko estimator
https://scikit-learn.org/stable/modules/generated/sklearn.feature_selection.mutual_info_classif.html#sklearn.feature_selection.mutual_info_classif
is good for mutual information (they use it for feature selection).
Why is it not possible to simply calculate the similarity/mutual information between each variable (feature) and another variable (the target)?
Then, if the similarity/mutual information for a given feature and the target is high, that feature is good to use?
It seems I cannot understand something conceptual about the Kozachenko–Leonenko mutual information estimator.
Could you share a link to simple, plain Python example code for the Kozachenko–Leonenko mutual information estimator, please?
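(For context, the per-feature MI ranking described in this question is essentially what `mutual_info_classif` does. A minimal sketch, not code from this repository; the iris data set is only a placeholder:)

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.feature_selection import mutual_info_classif

# placeholder data set: four continuous features, one discrete class label
X, y = load_iris(return_X_y=True)

# estimated mutual information between each feature and the target
scores = mutual_info_classif(X, y, random_state=0)

# rank features by their MI score; a higher score suggests a more informative feature
for idx in np.argsort(scores)[::-1]:
    print(f"feature {idx}: MI ~ {scores[idx]:.3f}")
```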
Sandy, you are asking me to comment on code that I haven't even read, much less written myself.
You should really head over to the statistics or signal processing Stack Exchange.
That being said, I think it is a terrible idea to use the Leonenko estimator for discrete data (it becomes unstable if any distances are close to zero, and for discrete variables, many distances may indeed be zero). If you want to understand how the estimator works, I would recommend the
A. Kraskov, H. Stögbauer, and P. Grassberger, "Estimating mutual information". Phys. Rev. E 69, 2004.
paper. It is very accessible. Both the Kozachenko–Leonenko estimator for entropy and the Kraskov estimator for MI are implemented in my code, so you can look up an implementation there.
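To make the zero-distance problem concrete, here is a minimal, self-contained sketch of a Kozachenko–Leonenko-style entropy estimator (not the implementation from this repository). Note the log of the nearest-neighbour distance, which diverges whenever duplicate samples make that distance zero:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gamma

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko-style estimate of differential entropy (in nats).

    x : (n_samples, n_dims) array of continuous samples
    k : number of nearest neighbours
    """
    n, d = x.shape
    tree = cKDTree(x)
    # distances to the k-th nearest neighbour (query k + 1 neighbours because
    # each point is its own nearest neighbour at distance zero)
    dist, _ = tree.query(x, k=k + 1)
    eps = dist[:, -1]
    # log-volume of the d-dimensional unit ball (Euclidean norm)
    log_c_d = 0.5 * d * np.log(np.pi) - np.log(gamma(0.5 * d + 1))
    # np.log(eps) is -inf for duplicate points -- this is why the estimator
    # is unreliable for discrete data
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))

# sanity check on continuous data: 1D standard normal, true entropy ~ 1.419 nats
samples = np.random.default_rng(0).normal(size=(10000, 1))
print(kl_entropy(samples))
```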
Related Issues (14)
- continuous entropy with KNN HOT 21
- Support for computing entropy of a Tensor HOT 9
- Mutual information is greater than information entropy HOT 25
- Does "partial mutual information" here mean "conditional mutual information"?
- Does "partial mutual information" here mean "conditional mutual information"? HOT 2
- Multiplying the euclidian distance by 2. HOT 2
- Error occurred with "get_imin" HOT 1
- Can we truely get the joint distribution P(x,y) to calculate the H(x,y) ? HOT 4
- Unexpected -inf entropy estimations HOT 6
- Transfer Entropy on Different Dimensions? HOT 2
- Regarding Maximal Entropy HOT 3
- Process finished with exit code -1073741571 (0xC00000FD) HOT 14
- readme import numpy as np HOT 1