Parameter Mutual Information

The node computes a value matrix and a parameter list containing information about all possible pairs of parameters selected in the configuration dialog. The algorithm is a histogram-based approach described in: Moddemeijer, R., "A statistic to estimate the variance of the histogram based mutual information estimator based on dependent pairs of observations", Signal Processing, 1999, vol. 75, no. 1, pp. 51-63.
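As a rough illustration of the histogram-based approach (this is a minimal plug-in estimator sketch, not the node's actual implementation, which follows Moddemeijer's paper including the bias corrections), the joint and marginal distributions are estimated from a 2-D histogram and combined into the mutual information:

```python
import numpy as np

def mutual_information(x, y, bins=None, base=2.0):
    """Plug-in histogram estimate of the mutual information between x and y.

    Illustrative sketch only: the function name and signature are assumptions,
    and no bias correction (unbiased/mmse) is applied here.
    """
    n = len(x)
    if bins is None or bins == 0:
        # Mirrors the node's default: round(numberOfTableRows^(1/3)) bins.
        bins = max(1, round(n ** (1.0 / 3.0)))
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = counts / n                          # joint distribution estimate
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x
    py = pxy.sum(axis=0, keepdims=True)       # marginal of y
    nz = pxy > 0                              # avoid log(0) terms
    mi_nats = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
    return mi_nats / np.log(base)             # convert to the chosen log base
```

A strongly dependent pair (e.g. a column against itself) yields a high value, while independent columns yield a value near zero (slightly positive, because the plug-in estimator is biased upward).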


Method
The method used to calculate the mutual information:
- unbiased (default)
- biased
- mmse (minimum mean square estimate)
Logarithmic base
The logarithmic base to use for entropy calculation (default is 2).
Number of bins
The number of bins used to discretize the data. If the input is 0, the default round(numberOfTableRows^(1/3)) is calculated for each parameter.
Axes linkage
If checked, the lower and upper bounds are determined from the combined values of the x and y vectors. If unchecked, the lower and upper bounds are calculated separately for each vector.
Threshold
The threshold above which the mutual information is considered to indicate similar parameters. This needs to be adjusted according to the binning.
Column filter
The numerical columns between which the mutual information is calculated can be selected with the column filter.
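Putting the options together, the pairwise computation over all selected columns can be sketched as follows. This is an illustrative mock-up, not the node's code: the function names, the dict-of-columns input, and the returned list of flagged pairs are all assumptions, and the estimator again omits the bias corrections.

```python
import numpy as np
from itertools import combinations

def linked_edges(x, y, bins):
    # Axes linkage checked: one common [lo, hi] range from both vectors.
    lo = min(x.min(), y.min())
    hi = max(x.max(), y.max())
    edges = np.linspace(lo, hi, bins + 1)
    return edges, edges

def mi_matrix(columns, bins=10, threshold=0.3, base=2.0):
    """Pairwise mutual-information matrix over named numeric columns,
    flagging pairs whose value exceeds the threshold. Illustrative only."""
    names = list(columns)
    m = np.zeros((len(names), len(names)))
    similar = []
    for i, j in combinations(range(len(names)), 2):
        x, y = columns[names[i]], columns[names[j]]
        ex, ey = linked_edges(x, y, bins)
        counts, _, _ = np.histogram2d(x, y, bins=(ex, ey))
        pxy = counts / len(x)
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])) / np.log(base)
        m[i, j] = m[j, i] = mi          # the value matrix is symmetric
        if mi > threshold:
            similar.append((names[i], names[j]))
    return m, similar
```

Note how the threshold only makes sense relative to the binning: more bins inflate the estimate for independent columns, so the cutoff for "similar" parameters has to move with it.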

Input Ports

input table

Output Ports

parameter list table (aggregated stats)
mutual information matrix


This node has no views


