
Keras PReLU Layer

Deprecated. KNIME Deep Learning - Keras Integration version 4.1.0.v201911110939 by KNIME AG, Zurich, Switzerland

Like the leaky ReLU, the parametric ReLU introduces a slope in the negative part of the input space to improve learning dynamics compared to ordinary ReLUs. The difference from leaky ReLUs is that here the slope alpha is treated as a parameter that is trained alongside the rest of the network's weights. Alpha is usually a vector containing a dedicated slope for each feature of the input (see also the Shared axes option). Corresponds to the Keras PReLU Layer.
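A minimal sketch of what the layer computes, using the Keras API that this node wraps (the constant initializer value 0.25 is only an illustrative choice):

```python
# PReLU computes f(x) = x for x >= 0 and f(x) = alpha * x for x < 0,
# where alpha is a trainable weight with one entry per input feature.
import tensorflow as tf

prelu = tf.keras.layers.PReLU(
    alpha_initializer=tf.keras.initializers.Constant(0.25)  # starting slope (assumption)
)
x = tf.constant([[-2.0, -1.0, 0.0, 1.0, 2.0]])
print(prelu(x).numpy())  # [[-0.5  -0.25  0.    1.    2.  ]]
```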

Options

Name prefix
The name prefix of the layer. The prefix is complemented by an index suffix to obtain a unique layer name. If this option is unchecked, the name prefix is derived from the layer type.
Alpha initializer
The initializer for alpha, usually zero or a small positive number.
Alpha regularizer
An optional regularizer for alpha.
Alpha constraint
An optional constraint on alpha.
Shared axes
Optional list of axes along which to share alpha. For example, in a 2D convolution with input shape (batch, height, width, channels) it is common to have one alpha per channel and to share it across the spatial dimensions. In this case one would set the shared axes to "1, 2" (see the sketch after this list).
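The options above correspond to the parameters of the underlying Keras layer. A hedged sketch of how they might be combined in plain Keras (parameter names are those of tf.keras.layers.PReLU; the exact mapping performed by the KNIME node is assumed, not verified):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, padding="same", input_shape=(28, 28, 3)),
    tf.keras.layers.PReLU(
        alpha_initializer="zeros",                          # Alpha initializer
        alpha_regularizer=tf.keras.regularizers.l2(1e-4),   # Alpha regularizer (illustrative)
        alpha_constraint=tf.keras.constraints.NonNeg(),     # Alpha constraint (illustrative)
        shared_axes=[1, 2],  # Shared axes: one alpha per channel, shared over height/width
    ),
])
model.summary()
```

With shared_axes=[1, 2], alpha has shape (1, 1, 16), i.e. a single trainable slope per channel instead of one per spatial position.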

Input Ports

The Keras deep learning network to which to add a PReLU layer.

Output Ports

The Keras deep learning network with an added PReLU layer.

Installation

To use this node in KNIME, install KNIME Deep Learning - Keras Integration from the following update site:

KNIME 4.1
