Keras Leaky ReLU Layer

A leaky ReLU is a rectified linear unit (ReLU) with a small, non-zero slope in the negative part of its input space. The motivation for leaky ReLUs is that vanilla ReLUs have a gradient of zero in the negative part of their input space, which can harm learning. Corresponds to the Keras Leaky ReLU layer.
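
Element-wise, the activation can be sketched as follows (a minimal illustration; the alpha value shown is arbitrary):

    import numpy as np

    # Leaky ReLU: identity for positive inputs, a scaled pass-through
    # (alpha * x) for negative inputs instead of a hard zero.
    def leaky_relu(x, alpha=0.01):
        return np.where(x >= 0, x, alpha * x)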

Options

Name prefix
The name prefix of the layer. The prefix is complemented by an index suffix to obtain a unique layer name. If this option is unchecked, the name prefix is derived from the layer type.
Alpha
Slope in the negative part of the input space. Usually a small positive value. Setting alpha to 0.0 corresponds to a standard ReLU; setting alpha to 1.0 corresponds to the identity function.
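
In the Keras Python API this layer roughly corresponds to the following sketch (the layer sizes and the alpha value are illustrative; newer Keras releases name the parameter negative_slope instead of alpha):

    import tensorflow as tf

    # A dense layer followed by a Leaky ReLU activation with alpha = 0.1.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(10,)),
        tf.keras.layers.Dense(64),
        tf.keras.layers.LeakyReLU(alpha=0.1),
    ])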

Input Ports

The Keras deep learning network to which to add a Leaky ReLU layer.

Output Ports

The Keras deep learning network with an added Leaky ReLU layer.

Views

This node has no views
