This directory contains 400 workflows.
Create a Big Data table, show the structure and fields
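A minimal HiveQL sketch of the two steps this title describes (table and column names are invented for illustration): create a table, then show its structure and fields.

    CREATE TABLE IF NOT EXISTS sales_demo (
        order_id   INT,
        customer   STRING,
        amount     DOUBLE,
        order_date DATE
    )
    STORED AS ORC;

    -- show the table's structure and fields
    DESCRIBE FORMATTED sales_demo;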
Hive - upload CSV GZIP files to HDFS and bring them together as an EXTERNAL table You have a gzipped CSV file that you want to upload to an HDFS folder. That […]
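Roughly how the external-table step could look in HiveQL, assuming the .csv.gz files have already been uploaded into one HDFS folder (path and columns are placeholders; Hive reads gzipped text files transparently):

    CREATE EXTERNAL TABLE IF NOT EXISTS csv_gz_staging (
        id    INT,
        name  STRING,
        price DOUBLE
    )
    ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    STORED AS TEXTFILE
    LOCATION '/user/big_data/csv_gz_upload/'
    -- skip the header row of each CSV file
    TBLPROPERTIES ('skip.header.line.count'='1');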
Hive - how to get from DB-Connectors to Hive (or Impala) tables There does not seem to be a direct way to get from the comfortable (brown) Database nodes […]
Hive - how to get from DB-Connectors to Hive (or Impala) tables - KNIME 4.3+ There does […]
Hive - how to get from DB-Connectors to Hive (or Impala) tables - KNIME 4.5+ There does not seem to be a direct way to get from the comfortable (brown) […]
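One pattern the DB-Connector entries above tend to converge on (a hedged sketch, not necessarily the exact node sequence of these workflows): stage the rows pulled through the database connector as files on HDFS, expose them as an external table, then turn them into a managed Hive table with CREATE TABLE AS SELECT. Table names are placeholders:

    -- db_import_staging is assumed to be an external table over the uploaded files
    CREATE TABLE db_import_orc
    STORED AS ORC
    AS
    SELECT *
    FROM db_import_staging;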
Use Try and Catch for generic ports; the existing table will be at the resulting port. Try some generic SQL operation on your Hive (or Impala) environment. […]
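The statement wrapped between the Try and Catch nodes can be any HiveQL that might fail at runtime; a placeholder example of such a generic SQL operation (table name assumed):

    -- fails if sales_demo does not exist, which is the case the Catch branch handles
    ALTER TABLE sales_demo ADD COLUMNS (discount DOUBLE);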
Hive - upload data in several ORC files to HDFS and bring them together as an EXTERNAL table You have several ORC files with the same structure that you […]
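As in the CSV case, the point is that all ORC files share the same structure and live in one HDFS folder, so a single external table can be declared on top of them (placeholder path and columns):

    CREATE EXTERNAL TABLE IF NOT EXISTS orc_combined (
        id    INT,
        name  STRING,
        price DOUBLE
    )
    STORED AS ORC
    LOCATION '/user/big_data/orc_upload/';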
School of Hive - with KNIME's local Big Data environment (SQL for Big Data) Demonstrates a collection of Hive functions using KNIME's local Big Data […]
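A taste of the kind of Hive functions such a collection usually demonstrates (illustrative only; the sales_demo columns from above are assumed):

    SELECT customer,
           order_date,
           amount,
           -- running total per customer via a window function
           SUM(amount) OVER (PARTITION BY customer ORDER BY order_date) AS running_total,
           -- string functions: build a composite key
           CONCAT_WS('-', customer, CAST(order_id AS STRING))           AS order_key
    FROM sales_demo;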
Using Java, Rule Engine, and IF Switch to decide whether a column is present and, based on that, which branch of a workflow should be executed