LMT (3.6) (legacy)

Classifier for building 'logistic model trees', which are classification trees with logistic regression functions at the leaves. The algorithm can deal with binary and multi-class target variables, numeric and nominal attributes, and missing values. For more information see: Niels Landwehr, Mark Hall, Eibe Frank (2005). Logistic Model Trees. Machine Learning 59(1-2):161-205. Marc Sumner, Eibe Frank, Mark Hall: Speeding up Logistic Model Tree Induction. In: 9th European Conference on Principles and Practice of Knowledge Discovery in Databases, 675-683, 2005.

(based on WEKA 3.6)
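
The following is a minimal sketch of what the node wraps, using the Weka Java API directly. The file name 'training.arff' is a placeholder, and the last attribute is assumed to be the target variable, which corresponds to the 'Class column' option described below.

  // Minimal training sketch (assumptions: placeholder ARFF path, last attribute is the class).
  import weka.classifiers.trees.LMT;
  import weka.core.Instances;
  import weka.core.converters.ConverterUtils.DataSource;

  public class LmtTrainSketch {
      public static void main(String[] args) throws Exception {
          // Load the training table from a placeholder ARFF file.
          Instances train = DataSource.read("training.arff");
          // Pick the target variable; this is what the 'Class column' option does in the dialog.
          if (train.classIndex() == -1) {
              train.setClassIndex(train.numAttributes() - 1);
          }
          LMT lmt = new LMT();         // logistic model tree with default settings
          lmt.buildClassifier(train);  // grow the tree with logistic regressions at the leaves
          System.out.println(lmt);     // textual description of the learned model
      }
  }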

For further options, click the 'More' button in the dialog.

All Weka dialogs have a panel where you can specify classifier-specific parameters.

Options

Class column
Choose the column that contains the target variable.
Preliminary Attribute Check

The Preliminary Attribute Check tests the underlying classifier against the DataTable specification at the node's input port. Columns that are compatible with the classifier are marked with a green 'ok'; columns that are potentially incompatible are assigned a red error message.

Important: If a column is marked as 'incompatible', it does not necessarily mean that the classifier cannot be executed! Sometimes, the error message 'Cannot handle String class' simply means that no nominal values are available (yet). This may change during execution of the predecessor nodes.

Capabilities: Nominal attributes, Binary attributes, Unary attributes, Empty nominal attributes, Numeric attributes, Date attributes, Missing values, Nominal class, Binary class, Missing class values
Dependencies: none
Minimum number of instances: 1
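
As an illustration of this check, the same kind of compatibility test can be run directly against the classifier's declared capabilities. This sketch assumes the data is already available as a weka.core.Instances object with the class attribute set.

  // Capability-check sketch (assumes 'data' is loaded and its class attribute is set).
  import weka.classifiers.trees.LMT;
  import weka.core.Instances;

  public class CapabilityCheckSketch {
      static void check(Instances data) {
          LMT lmt = new LMT();
          try {
              // Throws an exception describing the offending attribute or class type.
              lmt.getCapabilities().testWithFail(data);
              System.out.println("ok: data matches the classifier's capabilities");
          } catch (Exception e) {
              System.out.println("incompatible: " + e.getMessage());
          }
      }
  }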

Classifier Options

B: Binary splits (convert nominal attributes to binary ones)

R: Split on residuals instead of class values

C: Use cross-validation for boosting at all nodes (i.e., disable the heuristic)

P: Use error on probabilities instead of misclassification error as the stopping criterion for LogitBoost

I: Set a fixed number of LogitBoost iterations (instead of determining it by cross-validation)

M: Set the minimum number of instances at which a node can be split (default 15)

W: Set beta for weight trimming in LogitBoost; 0 (default) means no weight trimming

A: Use the AIC to choose the best iteration
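
The flags above can also be passed to the classifier programmatically. The following sketch uses illustrative values, not recommended settings.

  // Options sketch (the numeric values are illustrative only).
  import java.util.Arrays;
  import weka.classifiers.trees.LMT;

  public class LmtOptionsSketch {
      public static void main(String[] args) throws Exception {
          LMT lmt = new LMT();
          // -I fixed LogitBoost iterations, -M minimum instances per split,
          // -W weight-trimming beta (0 = off), -A use the AIC to pick the best iteration.
          lmt.setOptions(new String[]{"-I", "10", "-M", "15", "-W", "0", "-A"});
          System.out.println(Arrays.toString(lmt.getOptions()));
      }
  }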

Input Ports

Training data

Output Ports

Trained classifier
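
Downstream, the trained classifier is typically used to score new rows (for example by a Weka predictor node). The following sketch assumes a trained classifier and a test set whose structure, including the class attribute, matches the training data.

  // Prediction sketch (assumes 'classifier' is trained and 'test' matches the training structure).
  import weka.classifiers.Classifier;
  import weka.core.Instance;
  import weka.core.Instances;

  public class LmtPredictSketch {
      static void predict(Classifier classifier, Instances test) throws Exception {
          for (int i = 0; i < test.numInstances(); i++) {
              Instance inst = test.instance(i);
              double predicted = classifier.classifyInstance(inst);     // index of the predicted class
              double[] dist = classifier.distributionForInstance(inst); // class probability distribution
              System.out.println(test.classAttribute().value((int) predicted)
                      + " " + java.util.Arrays.toString(dist));
          }
      }
  }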

Views

Weka Node View
Each Weka node provides a summary view with information about the classification. If the test data contains a class column, an evaluation is generated as well.
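
For reference, a comparable evaluation can be produced with the Weka API itself. This sketch runs a 10-fold cross-validation on a placeholder training file; the fold count and random seed are purely illustrative.

  // Evaluation sketch (placeholder ARFF path; fold count and random seed are illustrative).
  import java.util.Random;
  import weka.classifiers.Evaluation;
  import weka.classifiers.trees.LMT;
  import weka.core.Instances;
  import weka.core.converters.ConverterUtils.DataSource;

  public class LmtEvalSketch {
      public static void main(String[] args) throws Exception {
          Instances data = DataSource.read("training.arff");
          data.setClassIndex(data.numAttributes() - 1);
          Evaluation eval = new Evaluation(data);
          eval.crossValidateModel(new LMT(), data, 10, new Random(1));
          System.out.println(eval.toSummaryString());  // accuracy and error statistics
          System.out.println(eval.toMatrixString());   // confusion matrix
      }
  }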
