05_Hive_to_Spark_to_Hive

Hive to Spark and Spark to Hive

This workflow demonstrates how to use the Hive to Spark and Spark to Hive nodes, which transfer data between Apache Hive and Apache Spark.

To run this workflow on a remote cluster, use an HDFS Connection node, a Hive Connector node, and a Create Spark Context (Livy) node (all available in the KNIME Big Data Connectors Extension) in place of the Create Local Big Data Environment node.

Workflow summary (recovered from the workflow canvas annotation): a File Reader node loads the test data into KNIME, and the Create Local Big Data Environment node provides the Hive and Spark contexts. A DB Table Creator node creates the test table and a DB Loader node loads it into Hive. The Hive to Spark node then converts a Hive query into a Spark DataFrame, the Spark to Hive node converts the Spark DataFrame back into a Hive table, and a DB Reader node loads the resulting data into KNIME.
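Outside of KNIME, the same round trip can be expressed directly against the Spark SQL API. The following is only a minimal PySpark sketch of the equivalent operations, not part of the workflow itself; the table names test_table and test_table_copy are hypothetical placeholders, and a Hive-enabled Spark session is assumed.

    from pyspark.sql import SparkSession

    # Spark session with Hive support (inside KNIME, the Create Local Big Data
    # Environment or Create Spark Context (Livy) node provides this context).
    spark = (SparkSession.builder
             .appName("hive_to_spark_to_hive")
             .enableHiveSupport()
             .getOrCreate())

    # "Hive to Spark": run a Hive query and receive the result as a Spark DataFrame.
    df = spark.sql("SELECT * FROM test_table")

    # "Spark to Hive": persist the Spark DataFrame as a Hive table.
    df.write.mode("overwrite").saveAsTable("test_table_copy")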
