
KNIME Project 471 Complete

02_LSTM_Network
Data Cleaning Time Series Analysis: LSTM Network

Summary: In this workflow we'll define an LSTM network architecture to train and deploy on the time series.

Instructions:
1) Run the workflow up through the Partitioning node; we'll start from here.
2) Before we can train our network, we need to define its architecture. First place the Keras Input Layer node, which defines the shape of the input data. Use shape [?, 200] for our 200 lagged values.
3) The next layer will be the LSTM layer; place a Keras LSTM Layer node to represent it. Use 100 units, activation function ReLU, and recurrent activation function Sigmoid.
4) Place the Keras Dense Layer as the last layer in the network architecture. Use 1 unit and activation function ReLU.
5) To implement the LSTM model, use the Keras Network Learner node and train the network on the training set, i.e. the bottom output of the Partitioning node. Use the List[...] column as input column, cluster_26 as target column, and MSE as loss function.
6) Forecast values in the test set with the Deployment Loop component. Check the performance with the Numeric Scorer and Line Plot.

[Workflow diagram annotations: Network Architecture ([200] tensor for 200 lagged inputs; LSTM with 100 units and ReLU; [1] tensor for output; training with MSE loss function), Create Input Vector (lag 200 values; combine into list), Data Output (80/20 split).]

Nodes in the workflow: Excel Reader, Column Filter, Lag Column, Column Aggregator, Partitioning, Keras Input Layer, Keras LSTM Layer, Keras Dense Layer, Keras Network Learner, Deployment Loop, Line Plot, Node 449.
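The data-preparation side of the workflow (Lag Column, Column Aggregator, Partitioning) can be sketched outside KNIME as well. The snippet below is a minimal NumPy illustration, not the workflow itself: the toy sine series stands in for the real cluster_26 column, and the function name make_supervised is an assumed helper, not a KNIME node.

```python
import numpy as np

def make_supervised(series, n_lags=200):
    """Lag Column + Column Aggregator equivalent: each row of X holds the
    n_lags previous values combined into one vector; y is the current value."""
    X = np.stack([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

# Toy stand-in for the real time series read by the Excel Reader.
series = np.sin(np.linspace(0, 20, 1000))
X, y = make_supervised(series)

# Partitioning node equivalent: 80/20 split, taken from the top so the
# test set is the most recent 20% and time order is preserved.
split = int(0.8 * len(X))
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]

# Keras expects rank-3 input [batch, time, features]; with input shape
# [?, 200] each sample is a sequence step carrying its 200 lagged values.
X_train = X_train[:, np.newaxis, :]
```

Splitting from the top rather than randomly matters for time series: shuffling would leak future values into the training set.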
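For readers who prefer to see the architecture from steps 2-4 as code, here is a hedged sketch of the same network written directly against the Keras Python API instead of the KNIME Keras layer nodes. The layer shapes and activations mirror the instructions above; the optimizer choice ("adam") is an assumption, since the KNIME Keras Network Learner configures it separately.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Keras Input Layer: shape [?, 200] -- "?" is KNIME's wildcard for an
# unknown dimension, i.e. variable sequence length with 200 lagged values.
inputs = keras.Input(shape=(None, 200))

# Keras LSTM Layer: 100 units, activation ReLU, recurrent activation Sigmoid.
x = layers.LSTM(100, activation="relu", recurrent_activation="sigmoid")(inputs)

# Keras Dense Layer: 1 unit, activation ReLU -- the single forecast value.
outputs = layers.Dense(1, activation="relu")(x)

model = keras.Model(inputs, outputs)

# Keras Network Learner equivalent: train with MSE loss (step 5).
model.compile(optimizer="adam", loss="mse")
```

The LSTM layer holds 4 * (100 * (200 + 100) + 100) = 120,400 weights (four gates, each with input, recurrent, and bias terms), plus 101 in the dense layer.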
