BERT Multi-label Classification Learner

The node uses a BERT model and adds a predefined neural network on top of it. Three layers are added:

  • GlobalAveragePooling1D layer
  • Dropout layer
  • Dense layer
The trained model can then be applied for multi-label classification.
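
The listing below is a minimal sketch of this architecture, assuming TensorFlow/Keras and the Hugging Face transformers package. It is not the node's source code; the checkpoint name, sequence length, class count, dropout rate and sigmoid activation are illustrative assumptions.

import tensorflow as tf
from transformers import TFBertModel

MAX_SEQ_LENGTH = 128  # "Max sequence length" option (assumed value, limit 512)
NUM_CLASSES = 5       # number of distinct labels in the Class column (assumed)

bert = TFBertModel.from_pretrained("bert-base-uncased")  # assumed checkpoint

input_ids = tf.keras.Input(shape=(MAX_SEQ_LENGTH,), dtype=tf.int32, name="input_ids")
attention_mask = tf.keras.Input(shape=(MAX_SEQ_LENGTH,), dtype=tf.int32, name="attention_mask")

# Token-level embeddings produced by BERT
sequence_output = bert(input_ids, attention_mask=attention_mask).last_hidden_state

# The three layers added on top of BERT
pooled = tf.keras.layers.GlobalAveragePooling1D()(sequence_output)
dropped = tf.keras.layers.Dropout(0.1)(pooled)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="sigmoid")(dropped)

model = tf.keras.Model(inputs=[input_ids, attention_mask], outputs=outputs)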

Options

Settings

Sentence column
A String column that contains the plain text to be classified. No special pre-processing is needed.
Class column
A column that contains class labels.
Max sequence length
The maximum length of a sequence after tokenization; the limit is 512 (see the tokenization sketch below).
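
For illustration, the sketch below shows how a sentence could be tokenized with a Hugging Face tokenizer; the checkpoint name and the max_length value are assumptions, not the node's actual settings.

from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")  # assumed checkpoint

encoded = tokenizer(
    "Raw text from the Sentence column",
    max_length=128,        # "Max sequence length" option, at most 512
    padding="max_length",  # shorter sentences are padded
    truncation=True,       # longer sentences are cut off at max_length
    return_tensors="tf",
)
# encoded["input_ids"] and encoded["attention_mask"] are fed to the BERT model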

Advanced

Number of epochs
The number of epochs used for training the classifier.
Batch size
The size of a chunk of the input data used for each model update.
Validation batch size
The size of a chunk of the validation data processed at a time.
Fine tune BERT
If checked, the BERT model is trained along with the additional classifier. Fine-tuning BERT takes more time, but the results are usually better.
Optimizer
Available optimizers and their configuration (see the training sketch below).
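
The sketch below indicates how these options could map onto a Keras training call. It reuses model and bert from the architecture sketch above; train_ids, train_mask, train_labels, val_ids, val_mask and val_labels are placeholder arrays standing in for the tokenized Data Table and Validation Table, and every concrete value (optimizer, learning rate, loss, epochs, batch sizes) is an assumption rather than the node's default.

import tensorflow as tf

# "Fine tune BERT": when unchecked, set trainable to False so only the three
# added layers are updated; training the full model is slower but usually better.
bert.trainable = True

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),  # "Optimizer" option
    loss="binary_crossentropy",                              # assumed loss for multi-label targets
    metrics=["accuracy"],
)

model.fit(
    {"input_ids": train_ids, "attention_mask": train_mask},
    train_labels,
    validation_data=({"input_ids": val_ids, "attention_mask": val_mask}, val_labels),
    epochs=3,                  # "Number of epochs"
    batch_size=32,             # "Batch size"
    validation_batch_size=32,  # "Validation batch size"
)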

Python

Python
Select one of the Python execution environment options:
  • use the default Python environment for Deep Learning
  • use a Conda environment

Input Ports

BERT Model
Data Table
Validation Table

Output Ports

BERT Classifier model
Stats

Views

This node has no views

