01_Training_a_Neural_Machine_Translation_Model

Neural Machine Translation from English to German: Training Workflow

This workflow trains a neural machine translation model at the character level using an encoder-decoder LSTM network.

The encoder network reads the input sentence character by character and summarizes it in its internal state. This state is then used as the initial state of the decoder network, which produces the translated sentence one character at a time. During prediction, the decoder also receives its previous output as input at the next time step. For training we use a technique called "teacher forcing": instead of the previous prediction, we feed the decoder the actual previous character of the target sentence, which greatly benefits training.
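The teacher-forcing setup above can be sketched in plain Python. This is a minimal illustration, not the workflow itself: the two-sentence toy corpus and the start/end markers ("\t" and "\n", as in the Keras blog post) are assumptions. It shows how the decoder's training input and target are the same sentence offset by one time step:

```python
import numpy as np

# Hypothetical toy corpus: English -> German, character level.
# "\t" marks the start of a target sequence, "\n" the end.
pairs = [("hi", "\thallo\n"), ("go", "\tgeh\n")]

# Build the target-side character vocabulary.
chars = sorted({c for _, tgt in pairs for c in tgt})
char_to_idx = {c: i for i, c in enumerate(chars)}
max_len = max(len(tgt) for _, tgt in pairs)

# One-hot tensors for the decoder side.
decoder_input = np.zeros((len(pairs), max_len, len(chars)))
decoder_target = np.zeros((len(pairs), max_len, len(chars)))

for i, (_, tgt) in enumerate(pairs):
    for t, c in enumerate(tgt):
        # Teacher forcing: at every step the decoder is fed the
        # actual previous target character, not its own prediction.
        decoder_input[i, t, char_to_idx[c]] = 1.0
        if t > 0:
            # The target sequence is the input shifted one step ahead.
            decoder_target[i, t - 1, char_to_idx[c]] = 1.0
```

During training, `decoder_input` is fed to the decoder (whose initial state comes from the encoder) and the network is fit against `decoder_target`; at prediction time no target is available, so the decoder's own previous output takes the place of `decoder_input`.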

This example is an adaptation of the following Keras blog post to KNIME: https://blog.keras.io/a-ten-minute-introduction-to-sequence-to-sequence-learning-in-keras.html
