
01_Big_Data_Connectors

This directory contains 11 workflows.

01_Big_Data_Preprocessing_Example

This workflow demonstrates the usage of the DB nodes in conjunction with the Create Local Big Data Environment node, which is part of the KNIME Big Data […]

02_HDFS_and_File_Handling_Example

This workflow demonstrates the HDFS file handling capabilities using the file handling nodes in conjunction with an HDFS connection. To run this workflow […]
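
Outside KNIME, the same kind of HDFS file listing is commonly done over WebHDFS, HDFS's standard REST interface. A minimal sketch, assuming a hypothetical namenode host and path (the defaults and names below are placeholders, not taken from the workflow):

```python
# Sketch: listing an HDFS directory over WebHDFS (HDFS's REST interface).
# Host, port, and path are placeholders for illustration only.
import json
from urllib.parse import urlencode
from urllib.request import urlopen  # only needed for the live call shown below

def webhdfs_url(host: str, path: str, op: str, port: int = 9870) -> str:
    """Build a WebHDFS v1 URL, e.g. for the LISTSTATUS or OPEN operations."""
    query = urlencode({"op": op})
    return f"http://{host}:{port}/webhdfs/v1{path}?{query}"

url = webhdfs_url("namenode.example.com", "/user/knime/data", "LISTSTATUS")
# With a reachable cluster you would fetch and decode the directory listing:
# listing = json.load(urlopen(url))["FileStatuses"]["FileStatus"]
```

Port 9870 is the default WebHDFS port on recent Hadoop 3.x namenodes; older clusters often use 50070 instead.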

03_DatabricksExample

This workflow demonstrates the usage of the Create Databricks Environment node, which allows you to connect to a Databricks cluster from within KNIME […]

04_GoogleCloudExample

URL: Google Cloud Dataproc https://cloud.google.com/dataproc/
URL: Apache Livy Initialization Action […]

05_Austin bike sharing with Google BigQuery

URL: Tutorial: Importing Data from Google BigQuery https://www.knime.com/blog/tutorial-importing-data-from-google-bigquery
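
For reference, the same Austin Bikeshare data lives in BigQuery's public datasets and can also be queried directly with Google's official google-cloud-bigquery client. A minimal sketch, assuming valid Google credentials and a placeholder project id (nothing here is taken from the workflow itself):

```python
# Sketch: querying the Austin Bikeshare public dataset with the official
# google-cloud-bigquery client. Running the commented call requires Google
# credentials; "my-project" is a placeholder project id.
QUERY = """
    SELECT start_station_name, COUNT(*) AS trips
    FROM `bigquery-public-data.austin_bikeshare.bikeshare_trips`
    GROUP BY start_station_name
    ORDER BY trips DESC
    LIMIT 10
"""

# from google.cloud import bigquery
# client = bigquery.Client(project="my-project")  # placeholder project
# for row in client.query(QUERY).result():
#     print(row.start_station_name, row.trips)
```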

06_Connecting_to_Databricks

This workflow shows how to connect to a Databricks cluster and utilize various KNIME nodes to interact with Databricks from within KNIME Analytics […]
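
Under the hood, a Databricks connection is authenticated REST traffic against the workspace. A minimal sketch of one such call, listing clusters via the Databricks Clusters API 2.0; the workspace URL and token below are placeholders:

```python
# Sketch: listing Databricks clusters via the Clusters API 2.0, authenticated
# with a personal access token. Workspace URL and token are placeholders.
DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "dapiXXXX"  # placeholder personal access token

def clusters_list_request(host: str, token: str):
    """Return (url, headers) for a GET on the clusters/list endpoint."""
    return (f"{host}/api/2.0/clusters/list",
            {"Authorization": f"Bearer {token}"})

url, headers = clusters_list_request(DATABRICKS_HOST, TOKEN)
# Against a real workspace you would issue the request:
# import requests
# resp = requests.get(url, headers=headers, timeout=30)
# clusters = resp.json().get("clusters", [])
```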

07_Will_They_Blend_BigQuery_Databricks

Google BigQuery meets Databricks: This workflow connects to the Austin Bikeshare dataset, hosted among the Google BigQuery public datasets and a […]

08_Connecting_to_Amazon_EMR

Connecting to Amazon EMR: This workflow demonstrates how to create a Spark context via Apache Livy and execute a simple Spark job on an Amazon EMR […]
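
Creating a Spark context through Apache Livy amounts to a POST against Livy's REST API. A minimal sketch of the session-creation request, with a placeholder EMR master endpoint (port 8998 is Livy's default):

```python
# Sketch: creating a Spark session on EMR through Apache Livy's REST API
# (POST /sessions). The endpoint below is a placeholder.
import json

LIVY_URL = "http://emr-master.example.com:8998"  # placeholder Livy endpoint

def create_session_payload(kind: str = "pyspark", **conf) -> str:
    """JSON body for POST /sessions; extra Spark conf goes under 'conf'."""
    body = {"kind": kind}
    if conf:
        body["conf"] = conf
    return json.dumps(body)

payload = create_session_payload("pyspark")
# Against a live cluster you would create the session and poll its state:
# import requests
# resp = requests.post(f"{LIVY_URL}/sessions", data=payload,
#                      headers={"Content-Type": "application/json"})
# session_id = resp.json()["id"]
```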

09_AzureExample

This workflow demonstrates how to connect to various Azure services such as HDInsight clusters, Azure Blob Storage, and AzureSQL from within KNIME Analytics […]
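
For comparison, connecting to one of those services, Azure Blob Storage, with Microsoft's official azure-storage-blob SDK looks as follows. A minimal sketch; the storage account name, container, and credential are placeholders:

```python
# Sketch: connecting to Azure Blob Storage with the official azure-storage-blob
# SDK. Account name, container, and credential are placeholders.
ACCOUNT = "mystorageaccount"  # placeholder storage account name

# Blob Storage endpoints follow the <account>.blob.core.windows.net pattern.
account_url = f"https://{ACCOUNT}.blob.core.windows.net"

# With the SDK installed and a real account key or token:
# from azure.storage.blob import BlobServiceClient
# service = BlobServiceClient(account_url=account_url, credential="<key>")
# for blob in service.get_container_client("data").list_blobs():
#     print(blob.name)
```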

10_Connecting_to_SMB

Operations on a remote file system using the SMB Connector: This workflow connects to a remote file system using the SMB Connector node. It performs some […]
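
Outside KNIME, comparable SMB operations can be sketched with the Python smbclient module from the smbprotocol package, which addresses shares by UNC path. Server, share, and credentials below are placeholders:

```python
# Sketch: remote file operations over SMB with the `smbclient` module
# (smbprotocol package). Server, share, and credentials are placeholders.
def unc_path(server: str, share: str, *parts: str) -> str:
    """Build a Windows-style UNC path: \\\\server\\share\\part\\..."""
    return "\\\\" + "\\".join((server, share) + parts)

path = unc_path("fileserver.example.com", "data", "input")

# With the package installed and a reachable server:
# import smbclient
# smbclient.register_session("fileserver.example.com",
#                            username="user", password="secret")  # placeholders
# for name in smbclient.listdir(path):
#     print(name)
```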