Group Mutual Information

The node computes the mutual information between two specified groups for a selected subset of parameters. In High Content Screening, a parameter's quality can typically be judged by the mutual information between the library measurements and the reference measurements: the lower the mutual information, the more independent the two distributions, and thus the more information the library carries that the reference does not. The mutual information algorithm is a histogram-based approach implemented according to: Moddemeijer R., "A statistic to estimate the variance of the histogram based mutual information estimator based on dependent pairs of observations", Signal Processing, 1999, vol. 75, no. 1, pp. 51-63.
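As an illustration, the plain (biased) plug-in estimate underlying such a histogram-based approach can be sketched as follows. Function and parameter names are illustrative, not the node's API, and the bias and variance corrections of the Moddemeijer estimator are deliberately omitted:

```python
import numpy as np

def histogram_mutual_information(x, y, bins=0, base=2.0, link_axes=True):
    """Plain plug-in MI estimate from a 2-D histogram (sketch only).

    The Moddemeijer estimator additionally applies bias/variance
    corrections that are not reproduced here.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = x.size
    if bins == 0:
        # cube-root default from the node description
        bins = max(2, round(n ** (1 / 3)))
    if link_axes:
        # shared lower/upper bounds from the combined x and y values
        lo = min(x.min(), y.min())
        hi = max(x.max(), y.max())
        hist_range = [[lo, hi], [lo, hi]]
    else:
        # separate bounds for each vector
        hist_range = [[x.min(), x.max()], [y.min(), y.max()]]
    counts, _, _ = np.histogram2d(x, y, bins=bins, range=hist_range)
    pxy = counts / n                      # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # skip empty cells (0 * log 0 = 0)
    mi_nats = np.sum(pxy[nz] * np.log(pxy[nz] / (px * py)[nz]))
    return float(mi_nats / np.log(base))  # convert to the chosen log base
```

Identical vectors yield a high estimate (roughly the entropy of the binned variable), while independent vectors yield a value near zero, matching the interpretation above.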


Method
The method used to calculate the mutual information:
- unbiased (default)
- biased
- mmse (minimum mean square estimate)
Logarithmic base
The logarithmic base to use for entropy calculation (default is 2).
Number of bins
The number of bins used to discretize the data. If the input is 0, the default round(numberOfTableRows^(1/3)) is calculated for each parameter.
Axes linkage
If checked, the lower and upper bounds are determined using the combined information from the x and y vectors. If unchecked, the lower and upper bounds are calculated for each vector separately.
Grouping column
Column containing the categorical variable used to group the measurements.
Reference
The reference is one or a set of negative controls. Ideally it contains the most abundant negative control that is used for plate normalization (such as DMSO or MOCK).
Library
The library wells should contain measurements of everything other than the reference (positive controls, library reagents, ...).
Column filter
The numerical columns between which the mutual information is calculated can be selected with the column filter.
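Putting the options together, the per-column computation might be sketched as below. All names are illustrative, and pairing the reference and library values by truncating to equal length is a simplification that may differ from the node's actual behavior:

```python
import numpy as np

def plug_in_mi(x, y, bins, base=2.0):
    # compact plug-in MI estimate from a joint histogram (no bias correction)
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = counts / counts.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px * py)[nz])) / np.log(base))

def group_mi_table(table, group_col, reference, library, value_cols, bins=0):
    """For each selected numeric column, MI between reference and library values."""
    groups = np.asarray(table[group_col])
    result = {}
    for col in value_cols:
        vals = np.asarray(table[col], dtype=float)
        ref = vals[groups == reference]
        lib = vals[groups == library]
        m = min(ref.size, lib.size)  # truncate to equal length (simplification)
        # cube-root default, computed per parameter when bins is 0
        b = bins if bins else max(2, round(m ** (1 / 3)))
        result[col] = plug_in_mi(ref[:m], lib[:m], bins=b)
    return result
```

A usage sketch with a DMSO reference group and a library group:

```python
rng = np.random.default_rng(1)
table = {
    "treatment": np.array(["DMSO"] * 200 + ["lib"] * 200),
    "intensity": rng.normal(size=400),
}
mi = group_mi_table(table, "treatment", "DMSO", "lib", ["intensity"])
```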

Input Ports

input table

Output Ports

mutual information table


This node has no views

