
03_Define_DL_Network_with_Keras_solution

03 Define DL Network with Keras

This workflow is part of a collection of exercise/solution materials used at a hands-on workshop held at the German Conference on Bioinformatics (GCB-2020). The title of the workshop was "Binding Preference Prediction using KNIME Analytics Platform and its Keras Deep Learning Integration".

Exercise 03: Define a Deep Learning Network with Keras

Network architecture:
- Input Layer (Shape = "101, 4", i.e. sequence length 101 x 4 one-hot channels)
- Convolution 1D Layer (Filters = 16, Kernel size = 11, Activation function = "ReLU")
- Activation Layer (Activation function = "ReLU")
- Max Pooling 1D Layer (Pool size = 4, Strides = 3)
- Dropout Layer (Drop rate = 0.2, Noise shape = False)
- Bidirectional LSTM Layer, built from:
  - LSTM Layer (Units = 16, Activation = "Tanh", Recurrent Activation = "Sigmoid", Go backwards = False)
  - LSTM Layer (Units = 16, Activation = "Tanh", Recurrent Activation = "Sigmoid", Go backwards = True)
- Dropout Layer (Drop rate = 0.2, Noise shape = False)
- Dense Layer (Units = 16, Activation function = "ReLU")
- Dense Layer (Units = 2, Activation function = "Softmax")

Tasks:
1. Complete the workflow below to match the deep learning network architecture defined above. (Hint: The node name for a specific layer is usually "Keras <Layer Type>". For missing parameters, use the default node settings.)
2. Connect the last Keras layer node to the provided Keras Network Writer node.
3. Execute the completed workflow to save the (untrained) network architecture to disk for future use.

The data used in this workflow are from the following publication: Xiaoyong Pan, Peter Rijnbeek, Junchi Yan, Hong-Bin Shen. Prediction of RNA-protein sequence and structure binding preferences using deep convolutional and recurrent neural networks. BMC Genomics, 2018, 19:511. Specifically: https://github.com/xypan1232/iDeepS/tree/master/datasets/clip

[Workflow canvas: input is 101 (sequence length) x 4 (one-hot). Nodes: Keras Input Layer, Keras Convolution 1D Layer, Keras Activation Layer, Keras Max Pooling 1D Layer, Keras Dropout Layer, two Keras LSTM Layers (forward and backward), Keras Concatenate Layer, a second Keras Dropout Layer, two Keras Dense Layers, and Keras Network Writer. Annotations mark where to "Add three missing layers" and the "Bidirectional LSTM" block.]
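For reference, here is a minimal sketch of the same architecture written directly in Python with tf.keras (assuming TensorFlow 2.x; the output file name network_architecture.h5 is illustrative, not part of the exercise). The bidirectional block is constructed the same way the KNIME workflow does it: one forward LSTM, one LSTM with Go backwards = True, and a Concatenate layer joining the two.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Input: 101 sequence positions x 4 one-hot channels (A, C, G, U)
inputs = tf.keras.Input(shape=(101, 4))
x = layers.Conv1D(filters=16, kernel_size=11, activation="relu")(inputs)
x = layers.Activation("relu")(x)
x = layers.MaxPooling1D(pool_size=4, strides=3)(x)
x = layers.Dropout(0.2)(x)

# Bidirectional LSTM built from two LSTM layers plus a concatenation,
# mirroring the two Keras LSTM Layer nodes and the Keras Concatenate Layer node
fwd = layers.LSTM(16, activation="tanh", recurrent_activation="sigmoid")(x)
bwd = layers.LSTM(16, activation="tanh", recurrent_activation="sigmoid",
                  go_backwards=True)(x)
x = layers.Concatenate()([fwd, bwd])  # -> 32 units

x = layers.Dropout(0.2)(x)
x = layers.Dense(16, activation="relu")(x)
outputs = layers.Dense(2, activation="softmax")(x)

model = tf.keras.Model(inputs, outputs)
# Analogous to the Keras Network Writer node: save the untrained architecture
model.save("network_architecture.h5")
```

Note that the network is only defined and saved here, not trained; training is the subject of a later exercise in the workshop collection.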
