Node Connectivity

There are 3042 nodes that can be used as successors for a node with an output port of type Table.

LADTree (3.6) (legacy) 

Class for generating a multi-class alternating decision tree using the LogitBoost strategy. For more info, see Geoffrey Holmes, Bernhard Pfahringer, […]

LMT (3.6) (legacy) 

Classifier for building 'logistic model trees', which are classification trees with logistic regression functions at the leaves. The algorithm can deal with […]

M5P (3.6) (legacy) 

M5Base. Implements base routines for generating M5 Model trees and rules. The original M5 algorithm was invented by R. Quinlan, and Yong Wang made […]
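
These legacy nodes wrap classifiers from Weka 3.6, so the same learners can also be invoked directly through the Weka Java API. As a minimal sketch only (assuming weka.jar 3.6 on the classpath and a hypothetical housing.arff file with a numeric target attribute), an M5P model tree might be built like this:

```java
import weka.classifiers.trees.M5P;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class M5PExample {
    public static void main(String[] args) throws Exception {
        // Load a data table with a numeric class attribute (hypothetical file).
        Instances data = DataSource.read("housing.arff");
        data.setClassIndex(data.numAttributes() - 1);

        M5P m5p = new M5P();
        m5p.setMinNumInstances(4.0); // minimum number of instances per leaf
        m5p.buildClassifier(data);

        System.out.println(m5p); // textual representation of the model tree
        double pred = m5p.classifyInstance(data.instance(0)); // numeric prediction
        System.out.println("Prediction for first row: " + pred);
    }
}
```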

NBTree (3.6) (legacy) 

Class for generating a decision tree with naive Bayes classifiers at the leaves. For more information, see Ron Kohavi: Scaling Up the Accuracy of […]

REPTree (3.6) (legacy) 

Fast decision tree learner. Builds a decision/regression tree using information gain/variance and prunes it using reduced-error pruning (with backfitting). […]

RandomForest (3.6) (legacy) 

Class for constructing a forest of random trees. For more information see: Leo Breiman (2001). Random Forests. Machine Learning. 45(1):5-32.
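
For illustration, a sketch of how the underlying Weka 3.6 RandomForest classifier might be trained and cross-validated in Java; the file name and parameter values are assumptions, not KNIME defaults:

```java
import java.util.Random;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.RandomForest;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class RandomForestExample {
    public static void main(String[] args) throws Exception {
        // Load a table with a nominal class attribute (hypothetical file).
        Instances data = DataSource.read("iris.arff");
        data.setClassIndex(data.numAttributes() - 1);

        RandomForest forest = new RandomForest();
        forest.setNumTrees(100);  // number of random trees in the forest
        forest.setNumFeatures(0); // 0 = Weka's default, log2(#attributes) + 1
        forest.setSeed(1);

        // Estimate accuracy with 10-fold cross-validation.
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(forest, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}
```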

RandomTree (3.6) (legacy) 

Class for constructing a tree that considers K randomly chosen attributes at each node. Performs no pruning. Also has an option to allow estimation of […]

SimpleCart (3.6) (legacy) 

Class implementing minimal cost-complexity pruning. Note: when dealing with missing values, use the "fractional instances" method instead of surrogate split […]

UserClassifier (3.6) (legacy) 

Interactively classify through visual means. You are presented with a scatter graph of the data against two user-selectable attributes, as well as a view of […]

ConjunctiveRule (3.6) (legacy) 

This class implements a single conjunctive rule learner that can predict for numeric and nominal class labels. A rule consists of antecedents "AND"ed […]
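
Again purely as an illustrative sketch (the data file and the pruning setting are assumptions), the ConjunctiveRule learner from Weka 3.6 can be built and inspected in the same way as the classifiers above:

```java
import weka.classifiers.rules.ConjunctiveRule;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class ConjunctiveRuleExample {
    public static void main(String[] args) throws Exception {
        // Works for both nominal and numeric class attributes (hypothetical file).
        Instances data = DataSource.read("weather.arff");
        data.setClassIndex(data.numAttributes() - 1);

        ConjunctiveRule rule = new ConjunctiveRule();
        rule.setFolds(3); // folds used for reduced-error pruning of the rule
        rule.buildClassifier(data);

        // Prints the single learned rule: ANDed antecedents => consequent.
        System.out.println(rule);
    }
}
```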