
Keras ELU Layer

KNIME Deep Learning - Keras Integration version 3.6.0.v201807091039 by KNIME AG, Zurich, Switzerland

Exponential linear units (ELUs) were introduced to alleviate the disadvantages of ReLU and LeakyReLU units: they push the mean activation closer to zero while still saturating to a negative value, which increases robustness against noise when the unit is in an off state (i.e., the input is very negative). The formula is f(x) = alpha * (exp(x) - 1) for x < 0 and f(x) = x for x >= 0. For the exact details, see the corresponding paper (Clevert et al., 2015). Corresponds to the Keras ELU Layer.

Options

Name prefix
The name prefix of the layer. The prefix is complemented by an index suffix to obtain a unique layer name. If this option is unchecked, the name prefix is derived from the layer type.
Alpha
Scale (the alpha in the formula above) for the negative part of the exponential linear unit.

Input Ports

The Keras deep learning network to which to add an ELU layer.

Output Ports

The Keras deep learning network with an added ELU layer.

Update Site

To use this node in KNIME, install KNIME Deep Learning - Keras Integration from the following update site:
