
Keras Thresholded ReLU Layer

KNIME Deep Learning - Keras Integration version 4.2.1.v202008251157 by KNIME AG, Zurich, Switzerland

Similar to an ordinary ReLU, but shifted by theta: f(x) = x for x > theta, f(x) = 0 otherwise. Corresponds to the Keras Thresholded ReLU layer.
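The activation above can be sketched in plain Python (a standalone illustration of the formula, not the KNIME node or the Keras layer itself):

```python
def thresholded_relu(x, theta=1.0):
    # f(x) = x for x > theta, f(x) = 0 otherwise
    return x if x > theta else 0.0

# With theta = 0 this reduces to an ordinary ReLU.
print([thresholded_relu(v, theta=1.0) for v in [-2.0, 0.5, 1.0, 3.0]])  # [0.0, 0.0, 0.0, 3.0]
```

Note that the comparison is strict: an input exactly equal to theta is mapped to 0.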


Name prefix
The name prefix of the layer. The prefix is complemented by an index suffix to obtain a unique layer name. If this option is unchecked, the name prefix is derived from the layer type.
Theta
Threshold location of the activation. A theta of 0 corresponds to an ordinary ReLU.

Input Ports

The Keras deep learning network to which to add a Thresholded ReLU layer.

Output Ports

The Keras deep learning network with an added Thresholded ReLU layer.


To use this node in KNIME, install KNIME Deep Learning - Keras Integration from the following update site:

