Calculates the entropy-based uncertainty score of a class probability distribution. The input rows contain class probabilities P = (p_1, p_2, ..., p_n) that must sum to 1. The output is the normalized Shannon entropy, defined as E(P) = H(P) / log2(n), where H(P) = -sum(p_i * log2(p_i)) for i = 1, ..., n. The logarithm with base 2 is used, and by convention a term with p_i = 0 contributes 0. The normalization always yields values between 0 and 1: a uniform distribution (the most uncertain case, since all probabilities are equal) has a score of 1, while a distribution in which one class probability is 1 and all others are 0 (the most certain case) has a score of 0.
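For reference, here is a minimal Python sketch of this scoring formula. The function name and implementation are illustrative assumptions, not the node's actual code:

    import math

    def normalized_entropy(probs):
        """Normalized Shannon entropy E(P) = H(P) / log2(n).

        probs: sequence of class probabilities summing to 1.
        Returns a value in [0, 1]: 1 for a uniform distribution,
        0 when a single class has probability 1.
        """
        n = len(probs)
        if n < 2:
            return 0.0  # a single-class distribution carries no uncertainty
        # Skip zero probabilities: 0 * log2(0) is taken to be 0.
        h = -sum(p * math.log2(p) for p in probs if p > 0)
        return h / math.log2(n)

    print(normalized_entropy([0.25, 0.25, 0.25, 0.25]))  # 1.0, most uncertain
    print(normalized_entropy([1.0, 0.0, 0.0]))           # 0.0, most certain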
To use this node in KNIME, install the KNIME Active Learning extension from the update site below, following our NodePit Product and Node Installation Guide:
A zipped version of the software site can be downloaded here.