Note: To avoid an accidental cluster startup, this node creates a dummy DB and Spark port if it is loaded in an executed state from a stored workflow. Reset and execute the node to start the cluster and create a Spark execution context.
Cluster access control: KNIME uploads additional libraries to the cluster. This requires the cluster-level "Can Manage" permission if your cluster is secured with access control. See the Databricks documentation on how to set up this permission.
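As an illustration, the sketch below grants a user the "Can Manage" permission on a cluster through the Databricks Permissions REST API. The workspace host, token, cluster ID, and user name are placeholder assumptions; this shows one way an administrator might set up the permission outside KNIME, not something the node does for you.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class GrantCanManage {
        public static void main(String[] args) throws Exception {
            // Placeholders: replace with your workspace host, personal access
            // token, cluster ID, and the user who needs "Can Manage".
            String host = "https://<your-workspace>.cloud.databricks.com";
            String token = "<personal-access-token>";
            String clusterId = "<cluster-id>";

            // Databricks Permissions API: PATCH adds entries without
            // replacing the existing access control list.
            String body = "{\"access_control_list\": [{"
                    + "\"user_name\": \"user@example.com\","
                    + "\"permission_level\": \"CAN_MANAGE\"}]}";

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(host + "/api/2.0/permissions/clusters/" + clusterId))
                    .header("Authorization", "Bearer " + token)
                    .header("Content-Type", "application/json")
                    .method("PATCH", HttpRequest.BodyPublishers.ofString(body))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }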
Choose the JDBC driver to connect to the database here. If you select "Use latest driver version available", the node will, upon execution, automatically use the driver with the latest (highest) version available for the current database type. This has the advantage that you do not need to touch the workflow after a driver update. However, the workflow might break in the rare case that the behavior of the driver (e.g. its type mapping) changes with the newer version.
If this option is not enabled, you can select a specific version of the registered drivers via the drop-down list. Additional drivers can be downloaded and registered via KNIME's preference page "KNIME -> Databases". For more details on how to register a new driver, see the database documentation.
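To make the "latest (highest) driver version" rule concrete, here is a minimal Java sketch that enumerates the JDBC drivers registered with the JVM and picks the one reporting the highest version. This mirrors the idea only; KNIME's actual selection logic is internal and also filters by database type.

    import java.sql.Driver;
    import java.sql.DriverManager;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.List;

    public class LatestDriver {
        public static void main(String[] args) {
            // Collect all JDBC drivers currently registered with the JVM.
            List<Driver> drivers = Collections.list(DriverManager.getDrivers());

            // Pick the driver reporting the highest major.minor version.
            drivers.stream()
                    .max(Comparator.comparingInt(Driver::getMajorVersion)
                            .thenComparingInt(Driver::getMinorVersion))
                    .ifPresent(d -> System.out.printf("Latest driver: %s %d.%d%n",
                            d.getClass().getName(),
                            d.getMajorVersion(), d.getMinorVersion()));
        }
    }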
This tab allows you to define JDBC driver connection parameters. The value of a parameter can be a constant, variable, credential user, credential password, or KNIME URL.
The UserAgentEntry parameter is added by default to all Databricks connections to track the usage of KNIME Analytics Platform as a Databricks client. If you are not comfortable sharing this information with Databricks, you can remove the parameter. However, if you want to promote KNIME as a client with Databricks, leave the parameter as is.
For more information about the JDBC driver and the UserAgentEntry, refer to the installation and configuration guide in the docs directory of the driver package.
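For illustration, this is roughly what the resulting connection looks like at the JDBC level, assuming the Databricks JDBC driver and token-based authentication; the host, HTTP path, and token are placeholders. The UserAgentEntry property is the parameter this tab adds by default.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.util.Properties;

    public class DatabricksUserAgent {
        public static void main(String[] args) throws Exception {
            // Placeholders: replace with your workspace host and cluster HTTP path.
            String url = "jdbc:databricks://<your-workspace>.cloud.databricks.com:443/default"
                    + ";transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3";

            Properties props = new Properties();
            props.setProperty("UID", "token");                  // token-based authentication
            props.setProperty("PWD", "<personal-access-token>");
            // The parameter discussed above: identifies the client application
            // to Databricks. Remove it if you prefer not to share this information.
            props.setProperty("UserAgentEntry", "KNIME");

            try (Connection con = DriverManager.getConnection(url, props)) {
                System.out.println("Connected: " + con.getMetaData().getDatabaseProductName());
            }
        }
    }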
This tab allows you to define KNIME framework properties such as connection handling, advanced SQL dialect settings or logging options. The available properties depend on the selected database type and driver.
This tab allows you to define rules to map from database types to KNIME types.
This tab allows you to define rules to map from KNIME types to database types.
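As a rough illustration of what such a mapping rule does, the sketch below translates JDBC type codes into KNIME-style column types. The mapping table here is hypothetical; KNIME's actual defaults depend on the selected database type and driver, and both tabs let you override them per type or per column.

    import java.sql.Types;

    public class TypeMappingSketch {
        // Hypothetical rule table: maps JDBC/database types to KNIME column types.
        static String toKnimeType(int jdbcType) {
            switch (jdbcType) {
                case Types.INTEGER:
                case Types.SMALLINT:  return "Number (integer)";
                case Types.BIGINT:    return "Number (long)";
                case Types.DOUBLE:
                case Types.FLOAT:
                case Types.DECIMAL:   return "Number (double)";
                case Types.BOOLEAN:   return "Boolean";
                case Types.TIMESTAMP: return "Date & time";
                default:              return "String"; // fallback rule
            }
        }

        public static void main(String[] args) {
            System.out.println(toKnimeType(Types.BIGINT)); // -> Number (long)
        }
    }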
To use this node in KNIME, install the extension KNIME Databricks Integration from the update site below, following our NodePit Product and Node Installation Guide:
A zipped version of the software site can be downloaded here.
Deploy, schedule, execute, and monitor your KNIME workflows locally, in the cloud or on-premises – with our brand new NodePit Runner.
Try NodePit Runner!
Do you have feedback, questions, or comments about NodePit, want to support this platform, or want your own nodes or workflows listed here as well? Do you think the search results could be improved or something is missing? Then please get in touch! Alternatively, you can send us an email to mail@nodepit.com.
Please note that this is only about NodePit. We do not provide general support for KNIME — please use the KNIME forums instead.