RProp MLP Learner

Deprecated. KNIME Base Nodes version 3.6.0.v201807061308 by KNIME AG, Zurich, Switzerland

Implementation of the RProp algorithm for multilayer feedforward networks. RPROP performs a local adaptation of the weight-updates according to the behavior of the error function. For further details see: Riedmiller, M., Braun, H.: "A direct adaptive method for faster backpropagation learning: the RPROP algorithm", Proceedings of the IEEE International Conference on Neural Networks (ICNN) (Vol. 16, pp. 586-591). Piscataway, NJ: IEEE. This node provides a view of the error plot.
If the optional PMML inport is connected and contains preprocessing operations in the TransformationDictionary those are added to the learned model.
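The local adaptation described above can be illustrated with a minimal sketch of a basic Rprop⁻-style update step (assuming NumPy; the parameter names and default values below follow common descriptions of the algorithm and are not taken from this node's actual implementation, which may use a different variant, e.g. with weight backtracking):

```python
import numpy as np

def rprop_step(w, grad, prev_grad, step,
               eta_plus=1.2, eta_minus=0.5,
               step_min=1e-6, step_max=50.0):
    """One RProp update: adapt each weight's individual step size from
    the sign of the current vs. previous partial derivative, then move
    each weight by its own step size in the downhill direction."""
    sign_change = grad * prev_grad
    # Same sign as last time: the error surface is consistent, grow the step.
    step = np.where(sign_change > 0, np.minimum(step * eta_plus, step_max), step)
    # Sign flipped: we jumped over a minimum, shrink the step.
    step = np.where(sign_change < 0, np.maximum(step * eta_minus, step_min), step)
    # After a sign flip, suppress the gradient so the next iteration
    # does not adapt the step size a second time.
    grad = np.where(sign_change < 0, 0.0, grad)
    w = w - np.sign(grad) * step
    return w, grad, step

# Toy example: minimize f(w) = sum(w^2), whose gradient is 2w.
w = np.array([3.0, -2.0])
prev_grad = np.zeros_like(w)
step = np.full_like(w, 0.1)
for _ in range(100):
    grad = 2.0 * w
    w, prev_grad, step = rprop_step(w, grad, prev_grad, step)
```

Note that only the sign of each partial derivative is used, not its magnitude; this is what makes RProp robust to the scale of the error function.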


Maximum number of iterations
The number of learning iterations.
Number of hidden layers
Specifies the number of hidden layers in the architecture of the neural network.
Number of hidden neurons per layer
Specifies the number of neurons contained in each hidden layer.
Class column
Choose the column that contains the target variable: it can either be nominal or numerical. All nominal class values are extracted and assigned to output neurons. If you use a numerical target variable (regression), please make sure it is normalized!
Ignore missing values
If this checkbox is set, rows with missing values will not be used for training.
Use seed for random initialization
If this checkbox is set, a seed (see next field) can be specified for initializing the weights and thresholds.
Random seed
Seed for the random number generator.
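The "Class column" option above asks you to normalize a numerical target before training. In KNIME this would typically be done upstream with a Normalizer node; purely as an illustration of what that preprocessing amounts to, here is a hedged min-max scaling sketch in Python (the function name and range are illustrative, not part of this node):

```python
import numpy as np

def min_max_normalize(y, lo=0.0, hi=1.0):
    """Scale values linearly into [lo, hi], the kind of range expected
    for a numerical (regression) target fed to output neurons."""
    y = np.asarray(y, dtype=float)
    y_min, y_max = y.min(), y.max()
    return lo + (y - y_min) * (hi - lo) / (y_max - y_min)

target = [12.0, 30.0, 21.0, 48.0]
normalized = min_max_normalize(target)  # values now lie in [0, 1]
```

Remember that predictions made on the normalized scale must be mapped back to the original range afterwards.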

Input Ports

Data table with training data
Optional PMML port object containing preprocessing operations.

Output Ports

RProp trained Neural Network


Error Plot
Displays the error for each iteration.


Update Site

To use this node in KNIME, install KNIME Base Nodes from the following update site:
