Conditional mutual information in pyAgrum #25
-
Suppose I have a pandas DataFrame containing a discrete dataset. How can I go about using the function …
Replies: 6 comments 13 replies
-
Hi @kenneth-lee-ch, when dealing with data, a good idea is to look at BNLearner: https://pyagrum.readthedocs.io/en/1.13.1/BNLearning.html#pyAgrum.BNLearner.mutualInformation
PS: oops, the description of "returns" is not correct ...
-
But how come it returns the G statistic (I am guessing it's some sort of CI test) instead of the actual mutual information?
-
It does return the actual (log2) mutual information; it is just the documentation that was wrong. (G2 and chi2 both return the value of the statistic and the p-value.)
-
@phwuil I tried to directly derive mutual information from a randomly generated BN via the code below, but I run into this error. Shouldn't I use …
-
Hi, you found a problem: it seems that (a part of) the state of the inference is kept from an `InformationTheory` instance to another. For now, a (not so expensive) workaround is to create the `LazyPropagation` object in the loop:

```python
import pyAgrum as gum
import numpy as np

bn = gum.randomBN(n=10)
mi_matrix = np.zeros((bn.size(), bn.size()))
for i, ni in bn:
    for j, nj in bn:
        if i <= j:
            continue
        # Workaround: build the LazyPropagation object inside the loop.
        mi_matrix[i, j] = gum.InformationTheory(gum.LazyPropagation(bn),
                                                ni, nj).mutualInformationXY()
print("Pairwise Mutual Information Matrix:")
print(mi_matrix)
```

(Note the loop in bn that iterates over all the pairs (node_id, …)
-
Note also that this code has a small problem: it assumes that the node ids are integers from 1 to n, which is not always the case. However, if you do not use the method …