Keras Thresholded ReLU Layer

Similar to an ordinary ReLU, but shifted by theta: f(x) = x for x > theta, f(x) = 0 otherwise. Corresponds to the Keras Thresholded ReLU layer.
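The activation itself is simple enough to sketch directly. The following is a minimal plain-Python illustration of the formula above (not the Keras implementation itself, which operates on tensors):

```python
def thresholded_relu(x, theta=1.0):
    """Thresholded ReLU sketch: f(v) = v for v > theta, 0 otherwise.

    x: iterable of numbers; theta: the threshold location.
    Note the strict inequality -- values equal to theta map to 0.
    """
    return [v if v > theta else 0.0 for v in x]

# With theta = 1.0, only values strictly greater than 1.0 pass through.
print(thresholded_relu([-1.0, 0.5, 1.0, 2.0], theta=1.0))  # [0.0, 0.0, 0.0, 2.0]
```

With theta set to 0, the function reduces to the ordinary ReLU.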


Options

Name prefix
The name prefix of the layer. The prefix is complemented by an index suffix to obtain a unique layer name. If this option is unchecked, the name prefix is derived from the layer type.

Theta
Threshold location of the activation. A theta of 0 corresponds to an ordinary ReLU.

Input Ports

The Keras deep learning network to which to add a Thresholded ReLU layer.

Output Ports

The Keras deep learning network with an added Thresholded ReLU layer.


This node has no views
