Conditional Mutual Information
InformationTheory.ConditionalMutualInformation.ConditionalMutualInformationMethod — Type
ConditionalMutualInformationMethod
An abstract type for different methods of calculating conditional mutual information.
InformationTheory.ConditionalMutualInformation.Hist — Type
Hist(; bins_x::Tuple = (-1,), bins_y::Tuple = (-1,), bins_z::Tuple = (-1,))
A method for calculating conditional mutual information using histograms.
Fields
bins_x::Tuple: A tuple specifying the binning strategy for the x variable.
bins_y::Tuple: A tuple specifying the binning strategy for the y variable.
bins_z::Tuple: A tuple specifying the binning strategy for the z variable.
If (-1,) is given for a variable, the binning is determined automatically by StatsBase.fit. Otherwise, a tuple of bin edges for each dimension should be provided.
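A quick construction sketch follows; the access path and the exact form of explicit bin edges are assumptions based on the signature above, not confirmed package behavior.

```julia
using InformationTheory.ConditionalMutualInformation: Hist  # assumed access path

# Automatic binning for x, y and z (the documented defaults).
m_auto = Hist()

# Hypothetical explicit binning: one vector of bin edges per dimension of x,
# while y and z keep the automatic StatsBase.fit binning.
m_edges = Hist(bins_x = ([-3.0, -1.0, 0.0, 1.0, 3.0],))
```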
InformationTheory.ConditionalMutualInformation.kNN — Type
kNN(; k::Int = 5)
A method for calculating conditional mutual information using the k-Nearest Neighbors (k-NN) estimator.
Fields
k::Int: The number of nearest neighbors to consider for each point.
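A minimal construction example, assuming the type is accessible as documented:

```julia
using InformationTheory.ConditionalMutualInformation: kNN  # assumed access path

m = kNN(k = 10)  # use 10 nearest neighbors instead of the default 5
```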
InformationTheory.ConditionalMutualInformation.c_mutual — Method
c_mutual(method::Hist, x::Tuple, y::Tuple, z::Tuple)
Calculates the conditional mutual information I(X;Y|Z) using a histogram-based method.
Arguments
method::Hist: The histogram-based calculation method.
x::Tuple: A tuple of vectors representing the data for variable X.
y::Tuple: A tuple of vectors representing the data for variable Y.
z::Tuple: A tuple of vectors representing the data for variable Z.
Returns
I::Float64: The calculated conditional mutual information.
Details
The function first fits a histogram to the joint data (x, y, z). The joint probability distribution is then approximated from the normalized histogram counts, and the conditional mutual information is computed from the resulting joint and marginal probabilities using
I(X;Y|Z) = sum(p(x,y,z) * log(p(x,y,z) * p(z) / (p(x,z) * p(y,z))))
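To make the formula concrete, here is a minimal, self-contained sketch of the same plug-in estimator for one-dimensional x, y and z, written directly against StatsBase histograms. It illustrates the idea only; the package's c_mutual additionally handles the Tuple-of-vectors data layout and the binning options of Hist, and hist_cmi_sketch is a hypothetical name.

```julia
using StatsBase: fit, Histogram

function hist_cmi_sketch(x::AbstractVector, y::AbstractVector, z::AbstractVector)
    h = fit(Histogram, (x, y, z))                # joint histogram, automatic bin edges
    p_xyz = h.weights ./ sum(h.weights)          # plug-in estimate of p(x,y,z)
    p_xz = dropdims(sum(p_xyz, dims=2), dims=2)  # marginal p(x,z)
    p_yz = dropdims(sum(p_xyz, dims=1), dims=1)  # marginal p(y,z)
    p_z  = dropdims(sum(p_xyz, dims=(1, 2)), dims=(1, 2))  # marginal p(z)
    I = 0.0
    for ix in axes(p_xyz, 1), iy in axes(p_xyz, 2), iz in axes(p_xyz, 3)
        p = p_xyz[ix, iy, iz]
        p > 0 || continue                        # empty bins contribute nothing
        I += p * log(p * p_z[iz] / (p_xz[ix, iz] * p_yz[iy, iz]))
    end
    return I
end

# Example: X and Y share a common driver Z plus a direct link, so I(X;Y|Z) > 0.
x = randn(5_000); z = randn(5_000); y = 0.6 .* x .+ 0.4 .* z .+ 0.2 .* randn(5_000)
hist_cmi_sketch(x, y, z)
```

A call following the documented signature would then look like c_mutual(Hist(), (x,), (y,), (z,)), assuming one vector per variable.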
InformationTheory.ConditionalMutualInformation.c_mutual — Method
c_mutual(method::kNN, x::Tuple, y::Tuple, z::Tuple)
Calculates the conditional mutual information I(X;Y|Z) using a k-NN based method.
Arguments
method::kNN: The k-NN based calculation method.
x::Tuple: A tuple of vectors representing the data for variable X.
y::Tuple: A tuple of vectors representing the data for variable Y.
z::Tuple: A tuple of vectors representing the data for variable Z.
Returns
I::Float64: The calculated conditional mutual information.
Details
This function uses the k-nearest neighbors (k-NN) algorithm to estimate the conditional mutual information. For each point, the estimate is based on counting the neighbors that lie within a given distance of that point in the Z, XZ, and YZ subspaces. This approach is particularly useful for high-dimensional data, where histogram-based methods break down.
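For intuition, the sketch below implements one standard estimator of this family (the Frenzel-Pompe, KSG-style estimator) by brute force for one-dimensional x, y and z: for each point it takes the Chebyshev distance to its k-th nearest neighbor in the joint (X,Y,Z) space and counts neighbors within that distance in the Z, XZ and YZ subspaces. It is an illustration under those assumptions, not the package's implementation, which may differ in metric, tie handling, and use of spatial trees; knn_cmi_sketch is a hypothetical name.

```julia
using SpecialFunctions: digamma

function knn_cmi_sketch(x::AbstractVector, y::AbstractVector, z::AbstractVector; k::Int = 5)
    N = length(x)
    @assert k < N "k must be smaller than the number of samples"
    # Chebyshev (max-norm) distance between points i and j over the given coordinates.
    cheb(i, j, vs...) = maximum(abs(v[i] - v[j]) for v in vs)
    acc = 0.0
    for i in 1:N
        # Distance from point i to its k-th nearest neighbor in the joint (X,Y,Z) space.
        eps_i = sort!([cheb(i, j, x, y, z) for j in 1:N if j != i])[k]
        # Neighbor counts strictly inside that distance in the Z, XZ and YZ subspaces.
        n_z  = count(j -> j != i && cheb(i, j, z) < eps_i, 1:N)
        n_xz = count(j -> j != i && cheb(i, j, x, z) < eps_i, 1:N)
        n_yz = count(j -> j != i && cheb(i, j, y, z) < eps_i, 1:N)
        acc += digamma(n_z + 1) - digamma(n_xz + 1) - digamma(n_yz + 1)
    end
    return digamma(k) + acc / N
end

# Example: the estimate stays positive because X and Y remain dependent given Z.
x = randn(1_000); z = randn(1_000); y = 0.5 .* x .+ 0.5 .* z .+ 0.1 .* randn(1_000)
knn_cmi_sketch(x, y, z; k = 5)
```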