Keras PReLU Layer

This Node Is Deprecated — This version of the node has been replaced with a new and improved version. The old version is kept for backwards compatibility, but for all new workflows we suggest using the version linked below.
Suggested replacement: Keras PReLU Layer

Like the leaky ReLU, the parametric ReLU (PReLU) introduces a slope in the negative part of the input space to improve learning dynamics compared to ordinary ReLUs: f(x) = x for x > 0 and f(x) = alpha * x for x <= 0. The difference to the leaky ReLU is that the slope alpha is treated as a parameter that is trained alongside the rest of the network's weights. Alpha is usually a vector containing a dedicated slope for each feature of the input (see also the Shared axes option). Corresponds to the Keras PReLU Layer.
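For orientation, the following is a minimal sketch of the equivalent layer written directly in the Keras Python API; it is not code generated by the node, and the layer sizes are illustrative. It shows that the PReLU layer adds a trainable alpha weight with one slope per input feature.

```python
import tensorflow as tf

# Toy model: a dense layer followed by a PReLU activation.
# alpha is a trainable weight of shape (64,), one slope per feature,
# initialized to zero here (the Keras default).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(64),
    tf.keras.layers.PReLU(alpha_initializer="zeros"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```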

Options

Name prefix
The name prefix of the layer. The prefix is complemented by an index suffix to obtain a unique layer name. If this option is unchecked, the name prefix is derived from the layer type.
Alpha initializer
The initializer for alpha, usually zero or a small positive number.
Alpha regularizer
An optional regularizer for alpha.
Alpha constraint
An optional constraint on alpha.
Shared axes
Optional list of axes along which to share alpha. For example, in a 2D convolution with input shape (batch, height, width, channels) it is common to have an alpha per channel and share the alpha across spatial dimensions. In this case one would set the shared axes to "1, 2".
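To illustrate the shared-axes setting, here is a hedged sketch in the Keras Python API (the input shape is chosen arbitrarily) that shares alpha across the spatial axes 1 and 2 of a convolutional feature map, leaving one trainable slope per channel:

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(32, 32, 16))                # (height, width, channels)
x = tf.keras.layers.Conv2D(16, 3, padding="same")(inputs)
# shared_axes=[1, 2] collapses the spatial dimensions of alpha,
# so its weight has shape (1, 1, 16) instead of (32, 32, 16).
x = tf.keras.layers.PReLU(shared_axes=[1, 2])(x)
model = tf.keras.Model(inputs, x)
```

Setting this node's Shared axes option to "1, 2" configures the same behaviour for the layer it adds.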

Input Ports

The Keras deep learning network to which to add a PReLU layer.

Output Ports

The Keras deep learning network with an added PReLU layer.

Views

This node has no views
