
05_Hive_to_Spark_to_Hive

Hive to Spark and Spark to Hive

This workflow demonstrates the use of the Hive to Spark and Spark to Hive nodes, which allow you to transfer data between Apache Spark and Apache Hive.

To run this workflow on a remote cluster, replace the Create Local Big Data Environment node with an HDFS Connection node, a Hive Connector node, and a Create Spark Context (Livy) node (all available in the KNIME Big Data Connectors extension).

Workflow overview:

- File Reader (Complex Format): load the test data into KNIME
- DB Table Creator: create the test table in Hive
- DB Loader: load the test data into the Hive table
- Hive to Spark: convert the Hive query to a Spark DataFrame
- Spark to Hive: convert the Spark DataFrame back to a Hive table
- DB Reader: load the resulting data into KNIME
- Create Local Big Data Environment: provides the local Hive connection and Spark context
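For readers who want to see what the Hive to Spark and Spark to Hive steps correspond to outside of KNIME, the following is a minimal PySpark sketch of the same round trip. It assumes a Spark session with Hive support (as the Create Local Big Data Environment or Livy context would provide); the table names are placeholders, not taken from the workflow.

```python
from pyspark.sql import SparkSession

# Assumption: a Spark session with Hive support, similar to what the
# Create Local Big Data Environment node (or a Livy context) provides.
spark = (
    SparkSession.builder
    .appName("hive_to_spark_to_hive")
    .enableHiveSupport()
    .getOrCreate()
)

# "Hive to Spark": read a Hive table/query into a Spark DataFrame.
# "test_table" is a hypothetical table name used for illustration.
df = spark.sql("SELECT * FROM test_table")

# ... arbitrary Spark transformations could be applied here ...

# "Spark to Hive": persist the DataFrame back as a Hive table.
df.write.mode("overwrite").saveAsTable("test_table_copy")
```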
