So, Mutual Information is analogous to covariance, and Entropy is analogous to variance.
For a given network, BayesiaLab can report the Symmetric Relative Mutual Information in several contexts:
Main Menu > Analysis > Report > Relationship Analysis
Note that the corresponding options under Preferences > Analysis > Visual Analysis > Arc's Mutual Information Analysis must be selected first.
In Preferences, Child refers to the Relative Mutual Information from the Parent onto the Child node, i.e., in the direction of the arc.
Conversely, Parent refers to the Relative Mutual Information from the Child onto the Parent node, i.e., in the opposite direction of the arc.
The Symmetric Relative Mutual Information computes the percentage of information gained by observing X and Y:

SRMI(X, Y) = I(X, Y) / sqrt(H(X) × H(Y))

This normalization is calculated similarly to Pearson's Correlation Coefficient:

ρ(X, Y) = cov(X, Y) / sqrt(σ²(X) × σ²(Y))

where σ² denotes variance.
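The normalization above can be illustrated with a short sketch. The joint probability table below is purely hypothetical (it does not come from BayesiaLab); the example computes the marginal entropies H(X) and H(Y), the Mutual Information I(X, Y), and the entropy-normalized ratio, mirroring how Pearson's correlation divides covariance by the square root of the product of the variances.

```python
import math

# Hypothetical joint probability table for two binary variables X and Y
# (illustrative numbers only, not taken from BayesiaLab).
joint = {
    ("x0", "y0"): 0.40,
    ("x0", "y1"): 0.10,
    ("x1", "y0"): 0.15,
    ("x1", "y1"): 0.35,
}

def marginal(joint, axis):
    """Sum the joint table over the other variable (axis 0 = X, axis 1 = Y)."""
    m = {}
    for states, p in joint.items():
        m[states[axis]] = m.get(states[axis], 0.0) + p
    return m

def entropy(dist):
    """Shannon entropy in bits: H = -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

px, py = marginal(joint, 0), marginal(joint, 1)
hx, hy = entropy(px), entropy(py)

# Mutual Information: I(X, Y) = sum p(x,y) * log2( p(x,y) / (p(x) * p(y)) )
mi = sum(
    p * math.log2(p / (px[x] * py[y]))
    for (x, y), p in joint.items()
    if p > 0
)

# Symmetric Relative Mutual Information, normalized like Pearson's rho:
# entropy plays the role that variance plays in the correlation coefficient.
srmi = mi / math.sqrt(hx * hy)

print(f"H(X) = {hx:.4f}  H(Y) = {hy:.4f}")
print(f"I(X, Y) = {mi:.4f}  SRMI = {srmi:.4f}")
```

Because I(X, Y) can never exceed either marginal entropy, the normalized value always falls between 0 (independence) and 1 (one variable fully determines the other).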
The Symmetric Normalized Mutual Information can also be shown by selecting Main Menu > Analysis > Visual > Overall > Arc > Mutual Information and then clicking the Show Arc Comments icon or selecting Main Menu > View > Show Arc Comments.