Keras Alpha Dropout Layer

Applies Alpha Dropout to the input. Alpha Dropout is a dropout variant that keeps the mean and variance of its inputs at their original values, in order to preserve the self-normalizing property even after dropout. It pairs well with Scaled Exponential Linear Units (SELU) because it randomly sets activations to the negative saturation value rather than zero. Corresponds to the Keras Alpha Dropout Layer.
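The mean/variance-preserving behavior can be sketched in plain NumPy. This is an illustrative reimplementation of the transform described in the SELU paper (Klambauer et al., 2017), not the node's actual code: dropped units are set to the SELU negative saturation value, then an affine correction restores the input's mean and variance. The constants and formulas below are the published SELU/Alpha-Dropout ones.

```python
import numpy as np

def alpha_dropout(x, rate, rng):
    """Illustrative Alpha Dropout: drop to the SELU negative saturation
    value, then apply an affine correction a * y + b that restores the
    original mean and variance (assuming standardized inputs)."""
    alpha = 1.6732632423543772
    scale = 1.0507009873554805
    alpha_p = -scale * alpha                 # SELU negative saturation value
    q = 1.0 - rate                           # keep probability
    mask = rng.random(x.shape) < q           # True = keep the activation
    y = np.where(mask, x, alpha_p)
    a = (q + alpha_p ** 2 * q * (1 - q)) ** -0.5
    b = -a * (1 - q) * alpha_p
    return a * y + b

rng = np.random.default_rng(42)
x = rng.standard_normal(1_000_000)           # standardized input
y = alpha_dropout(x, rate=0.1, rng=rng)
# y.mean() stays close to 0 and y.var() close to 1 despite the dropping.
```

Running this shows that, unlike plain dropout (which shifts the variance), the output statistics match the input's, which is what keeps a SELU network self-normalizing.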


Name prefix
The name prefix of the layer. The prefix is complemented by an index suffix to obtain a unique layer name. If this option is unchecked, the name prefix is derived from the layer type.
Drop rate
The drop probability (as with Dropout). The multiplicative noise will have standard deviation sqrt(rate / (1 - rate)).
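The quoted standard deviation can be checked empirically for ordinary (inverted) dropout, where each unit is multiplied by a Bernoulli(1 - rate) mask scaled by 1 / (1 - rate). This is a quick sanity check of the formula, not part of the node:

```python
import numpy as np

rate = 0.25
q = 1.0 - rate                               # keep probability
rng = np.random.default_rng(1)

# Inverted-dropout multiplier: Bernoulli(q) / q, which has mean 1 and
# standard deviation sqrt(rate / (1 - rate)).
m = (rng.random(1_000_000) < q) / q
empirical = m.std()
predicted = np.sqrt(rate / (1 - rate))       # ≈ 0.5774 for rate = 0.25
```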
Noise shape
The shape of the binary dropout mask that will be multiplied with the input. The noise shape must include the batch dimension; for example, for 2D images of shape [height, width, channels], the noise shape must have rank 4, i.e. [batch, height, width, channels]. To reuse the dropout mask along specific dimensions, set those dimensions to '1'. Spatial dropout, where whole feature maps are dropped, can be achieved by setting the noise shape to 'batch_size, 1, 1, feature_dim_size'.
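The mask-broadcasting effect of a reduced noise shape can be sketched in NumPy (the shapes below are hypothetical, and plain zeroing dropout is used for illustration rather than Alpha Dropout's saturation value):

```python
import numpy as np

# Hypothetical batch of 2 images, 4x4 pixels, 3 feature maps (channels).
batch, height, width, channels = 2, 4, 4, 3
rng = np.random.default_rng(0)
x = rng.standard_normal((batch, height, width, channels))

# Noise shape [batch, 1, 1, channels]: one keep/drop decision per
# (sample, channel) pair, broadcast across the spatial dimensions.
noise_shape = (batch, 1, 1, channels)
mask = rng.random(noise_shape) < 0.8         # keep probability 0.8
y = np.where(mask, x, 0.0)                   # spatial dropout: whole
                                             # feature maps are zeroed
```

Because the singleton dimensions are broadcast, every spatial position within a given feature map shares the same keep/drop decision, so a dropped map is zero everywhere rather than speckled with independently dropped pixels.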
Random seed
The random seed to use for dropping units.

Input Ports

The Keras deep learning network to which to add an Alpha Dropout layer.

Output Ports

The Keras deep learning network with an added Alpha Dropout layer.


This node has no views




