01_Big_Data_Connectors

This directory contains 10 workflows.

01_Big_Data_Preprocessing_Example

This workflow demonstrates the usage of the DB nodes in conjunction with the Create Local Big Data Environment node, which is part of the KNIME Big Data […]

02_HDFS_and_File_Handling_Example

This workflow demonstrates HDFS file handling capabilities, using the file handling nodes in conjunction with an HDFS connection. To run this workflow […]

03_DatabricksExample

This workflow demonstrates the usage of the Create Databricks Environment node, which allows you to connect to a Databricks cluster from within KNIME […]

04_GoogleCloudExample

This workflow demonstrates how to connect to various Google Cloud Services such as Google BigQuery, Google Dataproc, and Google Cloud Storage from within […]

05_Austin bike sharing with Google BigQuery

This workflow connects to the Austin Bikeshare dataset, hosted among the Google BigQuery public […]

06_Connecting_to_Databricks

This workflow shows how to connect to a Databricks cluster and use various KNIME nodes to interact with Databricks from […]

07_Will_They_Blend_BigQuery_Databricks

Google BigQuery meets Databricks. This workflow connects to the Austin Bikeshare dataset, hosted among the Google BigQuery public datasets and a […]

08_Connecting_to_Amazon_EMR

This workflow demonstrates how to create a Spark context via Apache Livy and execute a simple Spark job on an Amazon EMR […]
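Under the hood, creating a Spark context via Apache Livy comes down to two REST calls: `POST /sessions` to start a Spark session and `POST /sessions/{id}/statements` to submit code. A minimal sketch of those request bodies, assuming a hypothetical EMR master hostname (`emr-master`) and Livy's default port 8998:

```python
import json

# Hypothetical Livy endpoint on the EMR master node (placeholder host).
LIVY_URL = "http://emr-master:8998"

def session_payload(kind="pyspark"):
    """Request body for POST /sessions, which starts a Spark context.
    "kind" selects the interpreter (pyspark, spark, sparkr, sql)."""
    return {"kind": kind}

def statement_payload(code):
    """Request body for POST /sessions/{id}/statements, which runs a job."""
    return {"code": code}

# A trivial Spark job, submitted as a Livy statement:
job = statement_payload("sc.parallelize(range(100)).sum()")
print(json.dumps(session_payload()))
print(json.dumps(job))
```

The actual HTTP calls (e.g. with `requests.post(LIVY_URL + "/sessions", json=session_payload())`) are omitted here since they require a running cluster; KNIME's Create Spark Context (Livy) node handles this exchange for you.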

09_AzureExample

This workflow demonstrates how to connect to various Azure services, such as HDInsight clusters, Azure Blob Storage, and Azure SQL, from within KNIME Analytics […]

10_Connecting_to_SMB

Operations on a remote file system using the SMB Connector. This workflow connects to a remote file system using the SMB Connector node. It performs some […]