
trees

This category contains 16 nodes.

ADTree (3.6) (legacy) 

Class for generating an alternating decision tree. The basic algorithm is based on: Freund, Y., Mason, L.: The alternating decision tree learning […]

BFTree (3.6) (legacy) 

Class for building a best-first decision tree classifier. This class uses binary split for both nominal and numeric attributes. For missing values, the […]

DecisionStump (3.6) (legacy) 

Class for building and using a decision stump. Usually used in conjunction with a boosting algorithm. Does regression (based on mean-squared error) or […]
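
Since a decision stump is rarely used on its own, the sketch below pairs it with AdaBoostM1 as the weak learner. It assumes the node wraps the standard Weka 3.6 API and uses a hypothetical ARFF file named dataset.arff; the boosting round count is an illustrative choice, not a recommended setting.

    // Sketch: boosting decision stumps with AdaBoostM1 (assumes Weka 3.6 API,
    // hypothetical file "dataset.arff" with the class as the last attribute).
    import java.util.Random;
    import weka.classifiers.Evaluation;
    import weka.classifiers.meta.AdaBoostM1;
    import weka.classifiers.trees.DecisionStump;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class StumpBoostingSketch {
        public static void main(String[] args) throws Exception {
            Instances data = new DataSource("dataset.arff").getDataSet();
            data.setClassIndex(data.numAttributes() - 1);  // last attribute is the class

            AdaBoostM1 boosted = new AdaBoostM1();
            boosted.setClassifier(new DecisionStump());    // one-level tree as the weak learner
            boosted.setNumIterations(50);                  // number of boosting rounds (illustrative)

            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(boosted, data, 10, new Random(1));
            System.out.println(eval.toSummaryString());
        }
    }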

FT (3.6) (legacy) 

Classifier for building 'Functional trees', which are classification trees that can have logistic regression functions at the inner nodes and/or leaves. […]

Id3 (3.6) (legacy) 

Class for constructing an unpruned decision tree based on the ID3 algorithm. Can only deal with nominal attributes. No missing values allowed. Empty leaves […]
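
Because Id3 accepts only nominal attributes and no missing values, a minimal sketch (assuming the standard Weka 3.6 API and a hypothetical file dataset.arff) would discretize numeric attributes before building the tree:

    // Sketch: preparing data for Id3 (assumes Weka 3.6 API). Numeric attributes
    // are binned into nominal ones first; missing values must be handled beforehand.
    import weka.classifiers.trees.Id3;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;
    import weka.filters.Filter;
    import weka.filters.unsupervised.attribute.Discretize;

    public class Id3Sketch {
        public static void main(String[] args) throws Exception {
            Instances data = new DataSource("dataset.arff").getDataSet();
            data.setClassIndex(data.numAttributes() - 1);

            Discretize discretize = new Discretize();      // bin numeric attributes into nominal ones
            discretize.setInputFormat(data);
            Instances nominalData = Filter.useFilter(data, discretize);

            Id3 tree = new Id3();                          // unpruned ID3 tree
            tree.buildClassifier(nominalData);             // fails if missing values remain
            System.out.println(tree);
        }
    }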

J48 (3.6) (legacy) 

Class for generating a pruned or unpruned C4.5 decision tree. For more information, see Ross Quinlan (1993). C4.5: Programs for Machine Learning. Morgan […]
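
A minimal usage sketch, assuming the node wraps the standard Weka 3.6 J48 class and a hypothetical ARFF file dataset.arff; the confidence factor and minimum leaf size shown are simply the library defaults, and pruning can be switched off to obtain an unpruned tree:

    // Sketch: building a pruned C4.5 tree with J48 (assumes Weka 3.6 API,
    // hypothetical file "dataset.arff").
    import weka.classifiers.trees.J48;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class J48Sketch {
        public static void main(String[] args) throws Exception {
            Instances data = new DataSource("dataset.arff").getDataSet();
            data.setClassIndex(data.numAttributes() - 1);

            J48 tree = new J48();
            tree.setConfidenceFactor(0.25f);   // pruning confidence (library default)
            tree.setMinNumObj(2);              // minimum instances per leaf (library default)
            // tree.setUnpruned(true);         // uncomment for an unpruned tree

            tree.buildClassifier(data);
            System.out.println(tree);          // prints the induced tree
        }
    }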

J48graft (3.6) (legacy) 

Class for generating a grafted (pruned or unpruned) C4.5 decision tree. For more information, see Geoff Webb: Decision Tree Grafting From the […]

LADTree (3.6) (legacy) 

Class for generating a multi-class alternating decision tree using the LogitBoost strategy. For more info, see Geoffrey Holmes, Bernhard Pfahringer, […]

LMT (3.6) (legacy) 

Classifier for building 'logistic model trees', which are classification trees with logistic regression functions at the leaves. The algorithm can deal with […]

M5P (3.6) (legacy) 

M5Base. Implements base routines for generating M5 Model trees and rules. The original algorithm M5 was invented by R. Quinlan and Yong Wang made […]
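
A minimal regression sketch, assuming the standard Weka 3.6 M5P class and a hypothetical ARFF file regression.arff whose class attribute is numeric:

    // Sketch: fitting an M5 model tree for a numeric target with M5P
    // (assumes Weka 3.6 API, hypothetical file "regression.arff").
    import java.util.Random;
    import weka.classifiers.Evaluation;
    import weka.classifiers.trees.M5P;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class M5PSketch {
        public static void main(String[] args) throws Exception {
            Instances data = new DataSource("regression.arff").getDataSet();
            data.setClassIndex(data.numAttributes() - 1);   // numeric class attribute

            M5P model = new M5P();
            model.setMinNumInstances(4.0);   // minimum instances per leaf (library default)

            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(model, data, 10, new Random(1));
            System.out.println(eval.toSummaryString());      // correlation, MAE, RMSE, etc.
        }
    }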