
Information Theory in Bayesian Networks

This class gathers information-theoretic measures for nodesets named X, Y and (optionally) Z, all computed with a single (optimized) inference.

it=pyagrum.InformationTheory(ie,X,Y,Z)

  • Parameters:
    • ie (InferenceEngine) – the inference algorithm to use (for instance, pyagrum.LazyPropagation)
    • X (int or str or iterable[int or str]) – a first nodeset
    • Y (int or str or iterable[int or str]) – a second nodeset
    • Z (int or str or iterable[int or str], optional) – a third, optional nodeset
import pyagrum as gum

bn = gum.fastBN('A->B<-C<-D->E<-F->G->A')
ie = gum.LazyPropagation(bn)
it = gum.InformationTheory(ie, 'A', ['B', 'G'], ['C'])
print(f'Entropy(A)={it.entropyX()}')
print(f'MutualInformation(A;B,G)={it.mutualInformationXY()}')
print(f'MutualInformation(A;B,G|C)={it.mutualInformationXYgivenZ()}')
print(f'VariationOfInformation(A;B,G)={it.variationOfInformationXY()}')
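For reference, entropyX() reports the Shannon entropy of the posterior over X (in bits, assuming base-2 logarithms, as pyAgrum uses). A minimal pure-Python sketch of that formula, with illustrative probability values rather than values computed by pyAgrum:

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum_x p(x) * log2 p(x), with 0 * log 0 taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform binary marginal carries exactly one bit of uncertainty
print(shannon_entropy([0.5, 0.5]))        # 1.0
# A uniform marginal over four states carries two bits
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```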
  • entropyX() → float – the entropy of nodeset X.
  • entropyXY() → float – the entropy of the union of nodesets X and Y.
  • entropyXgivenY() → float – the conditional entropy of nodeset X given nodeset Y.
  • entropyYgivenX() → float – the conditional entropy of nodeset Y given nodeset X.
  • mutualInformationXY() → float – the mutual information between nodeset X and nodeset Y.
  • mutualInformationXYgivenZ() → float – the conditional mutual information between nodeset X and nodeset Y given nodeset Z.
  • variationOfInformationXY() → float – the variation of information between nodeset X and nodeset Y.
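The mutual information and variation of information reported above follow the standard definitions. A self-contained sketch over a small joint table (toy distributions, not produced by pyAgrum), useful for sanity-checking the quantities:

```python
import math

def marginals(joint):
    """Marginal distributions p(x) and p(y) from a joint table {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return px, py

def entropy(dist):
    """Shannon entropy in bits, skipping zero-probability entries."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def mutual_information(joint):
    """I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )."""
    px, py = marginals(joint)
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def variation_of_information(joint):
    """VI(X;Y) = H(X) + H(Y) - 2 I(X;Y)."""
    px, py = marginals(joint)
    return entropy(px) + entropy(py) - 2 * mutual_information(joint)

# Two independent fair coins: I = 0, VI = 2 bits
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
# Two perfectly correlated coins: I = 1 bit, VI = 0
corr = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(indep), variation_of_information(indep))  # 0.0 2.0
print(mutual_information(corr), variation_of_information(corr))    # 1.0 0.0
```

Independence drives the mutual information to zero and the variation of information to its maximum, while a deterministic relation does the opposite, matching the intuition behind variationOfInformationXY().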