Parameter Mutual Information

The node computes a value matrix and a parameter list containing information about all possible pairs of the parameters selected in the configuration dialog. The algorithm is a histogram-based approach described in Moddemeijer, R., "A statistic to estimate the variance of the histogram based mutual information estimator based on dependent pairs of observations", Signal Processing, 1999, vol. 75, no. 1, pp. 51-63.
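The histogram-based plug-in estimate can be sketched as follows. This is a minimal illustration of the general technique, not the node's actual implementation; the function name and signature are invented for this example, and only the plain ("biased") plug-in form is shown, without the bias corrections the unbiased and mmse methods apply.

```python
import numpy as np

def mutual_information(x, y, bins=None, base=2.0, link_axes=False):
    """Plug-in histogram estimate of mutual information (illustrative sketch).

    bins=None reproduces the documented default round(n^(1/3));
    link_axes mirrors the "Axes linkage" option (shared bounds for x and y).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    if bins is None:
        bins = max(1, round(n ** (1.0 / 3.0)))  # default bin count
    if link_axes:
        # shared lower/upper bounds from the combined x and y values
        lo, hi = min(x.min(), y.min()), max(x.max(), y.max())
        rng = [(lo, hi), (lo, hi)]
    else:
        # separate bounds per vector
        rng = [(x.min(), x.max()), (y.min(), y.max())]
    joint, _, _ = np.histogram2d(x, y, bins=bins, range=rng)
    pxy = joint / n                       # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0) terms
    mi_nats = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
    return mi_nats / np.log(base)         # convert to the chosen base
```

With identical inputs the estimate reduces to the entropy of the binned variable, e.g. `mutual_information(np.arange(100.0), np.arange(100.0))` gives log2(5) with the default 5 bins.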

Options

Method
The method to calculate mutual information:
- unbiased (default)
- biased
- mmse (minimum mean square estimate)
Logarithmic base
The logarithmic base to use for entropy calculation (default is 2).
Binning
Number of bins used to discretize the data. If the input is 0, the default round(numberOfTableRows^(1/3)) is calculated for each parameter.
Axes linkage
If checked, the lower and upper bounds are determined from the combined information of the x and y vectors. If unchecked, the lower and upper bounds are calculated separately for each vector.
Threshold
Threshold above which mutual information is considered to indicate similar parameters. It needs to be adjusted according to the binning.
Parameters
The numerical columns between which the mutual information is calculated can be selected with the column filter.
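Applying the threshold to the resulting matrix can be sketched as below. The helper name and the idea of returning name pairs are hypothetical conveniences for illustration; the node itself emits a parameter list table and a matrix rather than this exact structure.

```python
import numpy as np
from itertools import combinations

def similar_pairs(mi_matrix, names, threshold):
    """Hypothetical helper: list parameter pairs whose mutual
    information exceeds the configured threshold."""
    pairs = []
    for i, j in combinations(range(len(names)), 2):
        if mi_matrix[i, j] > threshold:
            pairs.append((names[i], names[j], float(mi_matrix[i, j])))
    return pairs

# Example: with threshold 1.0, only the (a, b) pair qualifies.
mi = np.array([[2.0, 1.5, 0.1],
               [1.5, 2.0, 0.2],
               [0.1, 0.2, 2.0]])
result = similar_pairs(mi, ["a", "b", "c"], threshold=1.0)
```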

Input Ports

Input table

Output Ports

Parameter list table (aggregated stats)
Mutual information matrix

Views

This node has no views
