Node Connectivity

There are 2738 nodes that can be used as successors for a node with an output port of type Table.

FT (3.6) (legacy) 

Classifier for building 'Functional trees', which are classification trees that could have logistic regression functions at the inner nodes and/or leaves. […]

Id3 (3.6) (legacy) 

Class for constructing an unpruned decision tree based on the ID3 algorithm. Can only deal with nominal attributes. No missing values allowed. Empty leaves […]
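The Id3 node wraps Weka's (Java) implementation, but the core recursion is simple: pick the nominal attribute with the highest information gain, split on its values, and recurse until the leaves are pure or no attributes remain. A minimal pure-Python sketch of that idea follows; the tuple-based tree encoding and function names are illustrative assumptions, not Weka's API:

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a list of nominal class labels.
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    # Expected entropy reduction from splitting on attribute index `attr`.
    split = {}
    for row, label in zip(rows, labels):
        split.setdefault(row[attr], []).append(label)
    remainder = sum(len(subset) / len(labels) * entropy(subset)
                    for subset in split.values())
    return entropy(labels) - remainder

def id3(rows, labels, attrs):
    # Unpruned decision tree over nominal attributes: a leaf is a class
    # label, an inner node is (attr_index, {value: subtree}).
    if len(set(labels)) == 1:
        return labels[0]                              # pure leaf
    if not attrs:
        return Counter(labels).most_common(1)[0][0]   # majority leaf
    best = max(attrs, key=lambda a: info_gain(rows, labels, a))
    groups = {}
    for row, label in zip(rows, labels):
        groups.setdefault(row[best], []).append((row, label))
    rest = [a for a in attrs if a != best]
    return (best, {value: id3([r for r, _ in pairs],
                              [l for _, l in pairs], rest)
                   for value, pairs in groups.items()})
```

Note that, like the node itself, this sketch assumes nominal attributes and no missing values: a row whose attribute value was never seen during training simply has no branch to follow.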

J48 (3.6) (legacy) 

Class for generating a pruned or unpruned C4.5 decision tree. For more information, see Ross Quinlan (1993). C4.5: Programs for Machine Learning. Morgan […]

J48graft (3.6) (legacy) 

Class for generating a grafted (pruned or unpruned) C4.5 decision tree. For more information, see Geoff Webb: Decision Tree Grafting From the […]

LADTree (3.6) (legacy) 

Class for generating a multi-class alternating decision tree using the LogitBoost strategy. For more info, see Geoffrey Holmes, Bernhard Pfahringer, […]

LMT (3.6) (legacy) 

Classifier for building 'logistic model trees', which are classification trees with logistic regression functions at the leaves. The algorithm can deal with […]

M5P (3.6) (legacy) 

M5Base. Implements base routines for generating M5 Model trees and rules. The original algorithm M5 was invented by R. Quinlan, and Yong Wang made […]

NBTree (3.6) (legacy) 

Class for generating a decision tree with naive Bayes classifiers at the leaves. For more information, see Ron Kohavi: Scaling Up the Accuracy of […]

REPTree (3.6) (legacy) 

Fast decision tree learner. Builds a decision/regression tree using information gain/variance and prunes it using reduced-error pruning (with backfitting). […]
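The distinguishing step in REPTree is the reduced-error pruning pass: working bottom-up over a held-out set, each subtree is replaced by its majority-class leaf whenever that replacement does not lower holdout accuracy. The following pure-Python sketch shows just that pruning pass over a simple nested-tuple tree encoding; the encoding and names are illustrative assumptions, not Weka's internals:

```python
from collections import Counter

def classify(tree, row):
    # A tree is either a class label (leaf) or (attr_index, {value: subtree}).
    while isinstance(tree, tuple):
        attr, branches = tree
        tree = branches.get(row[attr])
    return tree

def accuracy(tree, rows, labels):
    return sum(classify(tree, r) == l for r, l in zip(rows, labels)) / len(labels)

def rep_prune(tree, rows, labels):
    # Bottom-up reduced-error pruning against a held-out (rows, labels) set.
    if not isinstance(tree, tuple):
        return tree
    attr, branches = tree
    routed = {v: ([], []) for v in branches}
    for row, label in zip(rows, labels):       # route holdout rows downward
        if row[attr] in routed:                # rows with unseen values are dropped
            routed[row[attr]][0].append(row)
            routed[row[attr]][1].append(label)
    tree = (attr, {v: rep_prune(sub, *routed[v]) for v, sub in branches.items()})
    if not labels:                             # no holdout evidence: keep subtree
        return tree
    # Replace with a majority-class leaf if holdout accuracy does not drop.
    leaf = Counter(labels).most_common(1)[0][0]
    return leaf if accuracy(leaf, rows, labels) >= accuracy(tree, rows, labels) else tree
```

For example, given the overfit tree `(0, {"a": "pos", "b": (1, {"x": "neg", "y": "pos"})})` and a holdout set in which every `"b"` row is labelled `"neg"`, the inner split on attribute 1 is collapsed to the leaf `"neg"`, while the useful root split survives.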

RandomForest (3.6) (legacy) 

Class for constructing a forest of random trees. For more information see: Leo Breiman (2001). Random Forests. Machine Learning. 45(1):5-32.
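Breiman's recipe combines two sources of randomness: each tree is trained on a bootstrap sample of the data, and each split considers only a random subset of attributes; predictions are then made by majority vote. The toy pure-Python sketch below shows that structure using depth-1 "random trees" (stumps over a single randomly chosen nominal attribute) for brevity; all function names are illustrative assumptions, and this is not Weka's implementation:

```python
import random
from collections import Counter

def train_stump(rows, labels, attr):
    # Depth-1 tree: majority class per value of one nominal attribute,
    # with the overall majority class as the fallback for unseen values.
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[attr], []).append(label)
    table = {v: Counter(ls).most_common(1)[0][0] for v, ls in by_value.items()}
    default = Counter(labels).most_common(1)[0][0]
    return attr, table, default

def train_forest(rows, labels, n_trees, rng):
    # Bagging: each stump sees a bootstrap sample (drawn with replacement)
    # and a randomly chosen attribute.
    forest = []
    n, n_attrs = len(rows), len(rows[0])
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]
        attr = rng.randrange(n_attrs)
        forest.append(train_stump([rows[i] for i in idx],
                                  [labels[i] for i in idx], attr))
    return forest

def predict(forest, row):
    # Majority vote across the ensemble.
    votes = [table.get(row[attr], default) for attr, table, default in forest]
    return Counter(votes).most_common(1)[0][0]
```

Passing a seeded `random.Random` instance makes training reproducible, e.g. `train_forest(rows, labels, 25, random.Random(0))`.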