
08_LSTM_Network

Time Series Prediction

Time Series Analysis
07. LSTM Network

Summary:
In this exercise we'll define an LSTM network architecture, then train and deploy it on the time series.

Instructions:
1) Run the workflow up through the Partitioning node; we'll start from there.

2) Before we can train our network we need to define its architecture. First, place the Keras Input Layer node; this defines the shape of the input data. Use shape [ ? , 200 ] for our 200 lags, with a variable time dimension for the LSTM.

3) The next layer is the LSTM layer; place a Keras LSTM Layer node to represent it.

4) To implement the exponential smoothing model, use the Moving Average node. Select Simple Exponential as the type of moving average and the irregular component as the column.

5) To implement the naïve model, use the Lag Column node; it duplicates the selected column (the irregular component) with some amount of row lag. Set both lag interval and lags to 1.

6) Attach a Numeric Scorer node to the end of each branch from steps 3-5 and compare the results.
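In KNIME these baselines are configured through node dialogs rather than code. As a rough sketch of what the two baseline models and the Numeric Scorer compute (the function names and the smoothing constant `alpha` are illustrative choices, not KNIME settings):

```python
import numpy as np

def simple_exp_smoothing(x, alpha=0.5):
    # Simple exponential moving average: s_t = alpha * x_t + (1 - alpha) * s_{t-1}
    s = np.empty(len(x), dtype=float)
    s[0] = x[0]
    for t in range(1, len(x)):
        s[t] = alpha * x[t] + (1 - alpha) * s[t - 1]
    return s

def naive_lag(x, lag=1):
    # Lag Column with lag interval = 1 and lags = 1:
    # the previous value becomes the forecast for the current row
    out = np.full(len(x), np.nan)
    out[lag:] = x[:-lag]
    return out

def mse(actual, pred):
    # One of the error metrics the Numeric Scorer reports
    mask = ~np.isnan(pred)
    return float(np.mean((actual[mask] - pred[mask]) ** 2))

series = np.array([1.0, 2.0, 3.0, 4.0])
print(mse(series, naive_lag(series)))          # naive baseline error
print(mse(series, simple_exp_smoothing(series)))  # smoothing baseline error
```

Comparing the scorer output across the naïve, smoothed, and LSTM branches shows how much the network improves on the simple baselines.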

[Workflow diagram] Sections: Data Loading, Data Cleaning, Network Architecture, Create Input Vector, Deployment Loop. Annotated nodes: File Reader (energy usage data), Timestamp Alignment (introduce missing date-times), Missing Value, String to Date&Time (convert date/time into Date&Time objects), Column Filter, Line Plot, Lag Column (lag 200 values), Column Aggregator (combine into list), Partitioning (80/20 split), Keras Input Layer ([200] tensor for 200 lagged inputs), Keras LSTM Layer (LSTM with 100 units and ReLU), Keras Dense Layer ([1] tensor for output), Keras Network Learner (training with MSE loss function), Numeric Scorer.
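The Keras nodes assemble this network graphically in KNIME. A rough Python Keras equivalent, assuming the settings annotated in the workflow (200 lagged inputs with a variable time dimension, an LSTM with 100 ReLU units, a single-unit dense output, MSE loss), might look like:

```python
import tensorflow as tf  # assumption: TensorFlow-backed Keras, as KNIME's Keras integration uses

model = tf.keras.Sequential([
    # Keras Input Layer node: shape [ ? , 200 ] — variable time dimension, 200 lags
    tf.keras.layers.Input(shape=(None, 200)),
    # Keras LSTM Layer node: 100 units with ReLU activation
    tf.keras.layers.LSTM(100, activation="relu"),
    # Keras Dense Layer node: [1] tensor for the single predicted value
    tf.keras.layers.Dense(1),
])

# Keras Network Learner node: training with MSE loss function
model.compile(optimizer="adam", loss="mse")
model.summary()
```

This is a sketch of the architecture only; in the exercise, the layer shapes and the loss are set in the node dialogs rather than in code.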
