This workflow demonstrates the Create Databricks Environment node, which connects to a Databricks cluster from within KNIME Analytics Platform.
The node provides three output ports: one for the existing DB nodes to interact with the Databricks database, one for the file handling nodes to work with the Databricks File System (DBFS), and one for the Spark nodes to visually assemble Spark analytics flows. All of these nodes push the data processing down into the Databricks cluster.
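The KNIME node handles all three connections visually, without code. As a rough illustration of what each output port corresponds to programmatically, the following Python sketch opens the same three kinds of connections directly. It is a minimal sketch, not the node's implementation: the workspace hostname, SQL warehouse HTTP path, and cluster ID are placeholders, and it assumes the databricks-sql-connector, requests, and databricks-connect packages plus a personal access token in the DATABRICKS_TOKEN environment variable.

import os

import requests
from databricks import sql  # pip install databricks-sql-connector

HOST = "adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace hostname
TOKEN = os.environ["DATABRICKS_TOKEN"]               # personal access token
HTTP_PATH = "/sql/1.0/warehouses/abc123def456"       # placeholder SQL endpoint path

# Port 1 equivalent: a DB connection for running SQL against Databricks.
with sql.connect(server_hostname=HOST, http_path=HTTP_PATH, access_token=TOKEN) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT current_version()")
        print(cur.fetchall())

# Port 2 equivalent: file access to the Databricks File System via the DBFS REST API.
resp = requests.get(
    f"https://{HOST}/api/2.0/dbfs/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"path": "/"},
)
print(resp.json())

# Port 3 equivalent: a remote Spark session (pip install databricks-connect),
# so transformations execute on the cluster rather than locally.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.remote(
    host=f"https://{HOST}", token=TOKEN, cluster_id="0123-456789-abcdefgh"  # placeholder
).getOrCreate()
spark.sql("SELECT 1 AS ok").show()

In KNIME the same push-down happens implicitly: downstream DB, file handling, and Spark nodes operate through the connections the Create Databricks Environment node exposes, so the data stays in the cluster.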
URL: Databricks on Amazon AWS https://databricks.com/aws
URL: Databricks on Microsoft Azure https://databricks.com/product/azure