Keras Thresholded ReLU Layer

Similar to an ordinary ReLU, but with the activation threshold shifted to theta: f(x) = x for x > theta, f(x) = 0 otherwise. Corresponds to the Keras Thresholded ReLU layer.
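
For illustration, a minimal sketch of the equivalent layer in the TensorFlow/Keras Python API is shown below. The layer and argument names (ThresholdedReLU, theta) follow tf.keras and are assumptions for this example; they are not the node's internal code.

    import tensorflow as tf

    # Minimal sketch: a tiny network ending in a Thresholded ReLU activation.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(4,)),
        # f(x) = x for x > theta, f(x) = 0 otherwise
        tf.keras.layers.ThresholdedReLU(theta=1.0),
    ])

    x = tf.constant([[-2.0, 0.5, 1.0, 3.0]])
    # Only values strictly above theta pass through: [[0. 0. 0. 3.]]
    print(model(x).numpy())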

Options

Name prefix
The name prefix of the layer. The prefix is complemented by an index suffix to obtain a unique layer name. If this option is unchecked, the name prefix is derived from the layer type.
Theta
The threshold location of the activation. A theta of 0 corresponds to an ordinary ReLU (see the sketch below).
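
The following small NumPy sketch (an assumed illustration, not part of the node) shows the effect of the Theta option: with theta = 0 the function reproduces an ordinary ReLU, while a larger theta zeroes out a wider range of inputs.

    import numpy as np

    def thresholded_relu(x, theta):
        # f(x) = x for x > theta, f(x) = 0 otherwise
        return np.where(x > theta, x, 0.0)

    x = np.array([-1.5, 0.0, 0.7, 2.0])
    print(thresholded_relu(x, theta=0.0))  # [0.  0.  0.7 2. ] -- same as an ordinary ReLU
    print(thresholded_relu(x, theta=1.0))  # [0. 0. 0. 2.] -- threshold shifted to 1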

Input Ports

The Keras deep learning network to which to add a Thresholded ReLU layer.

Output Ports

The Keras deep learning network with an added Thresholded ReLU layer.

Views

This node has no views

