This directory contains 10 workflows.
Amazon S3 Remote File Example: In order to connect to the service, Amazon S3 credentials are required. Folders created at the root level are created as […]
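The workflow itself is built from KNIME connector nodes; purely as an illustration of the same credential-based connection pattern, here is a minimal Python sketch using boto3 (the credentials, region, and bucket name are placeholders, not values from the workflow):

    # Minimal sketch: connect to Amazon S3 with explicit credentials and list contents.
    # All credentials and names below are placeholders.
    import boto3

    s3 = boto3.client(
        "s3",
        aws_access_key_id="YOUR_ACCESS_KEY_ID",
        aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
        region_name="eu-west-1",
    )

    # Buckets play the role of root-level folders.
    for bucket in s3.list_buckets()["Buckets"]:
        print(bucket["Name"])

    # List the objects inside one bucket.
    response = s3.list_objects_v2(Bucket="example-bucket")
    for obj in response.get("Contents", []):
        print(obj["Key"], obj["Size"])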
Azure Blob Store Remote File Example: Connect to Azure Blob Storage using a Storage Account and an Access Key. Folders created at the root level are created […]
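For comparison, a minimal Python sketch of the same account-plus-key connection using the azure-storage-blob package (account name, key, and container name are placeholders):

    # Minimal sketch: connect to Azure Blob Storage with a storage account and access key.
    from azure.storage.blob import BlobServiceClient

    account_name = "examplestorageaccount"
    access_key = "YOUR_ACCESS_KEY"

    service = BlobServiceClient(
        account_url=f"https://{account_name}.blob.core.windows.net",
        credential=access_key,
    )

    # Containers correspond to the root-level folders mentioned above.
    for container in service.list_containers():
        print(container.name)

    # List blobs inside one container.
    container_client = service.get_container_client("example-container")
    for blob in container_client.list_blobs():
        print(blob.name, blob.size)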
This workflow accesses a compressed file from a local URL (knime://knime.workflow/data\2008.csv.bz2) and a compressed file from a remote URL […]
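Outside KNIME, the knime:// URL would be replaced by an ordinary path or HTTP URL; a minimal pandas sketch of reading such bzip2-compressed CSV files (both locations below are placeholders for the elided originals):

    # Minimal sketch: read a bzip2-compressed CSV from a local path and from a remote URL.
    import pandas as pd

    # Local file; pandas infers bz2 compression from the file extension.
    local_df = pd.read_csv("data/2008.csv.bz2")

    # Remote file over HTTP; compression handling is the same.
    remote_df = pd.read_csv("https://example.com/data/2008.csv.bz2")

    print(local_df.shape, remote_df.shape)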
Will They Blend? Amazon S3 meets MS Blob Storage plus Excel: The challenge here is to blend S3 data from the Amazon cloud with Blob Storage […]
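As a rough sketch of this kind of blending in Python (pandas with the optional s3fs and openpyxl packages; bucket, file, and column names are hypothetical):

    # Minimal sketch: blend a CSV stored on S3 with an Excel sheet.
    # All paths, bucket, and column names are placeholders.
    import pandas as pd

    # Reading "s3://..." paths requires the s3fs package; credentials come from the
    # environment or can be passed via storage_options.
    s3_df = pd.read_csv("s3://example-bucket/sales.csv")

    # An Excel file, e.g. a local copy pulled from Blob Storage; requires openpyxl.
    excel_df = pd.read_excel("blob_download/sales_extra.xlsx")

    # Blend the two sources on a shared key column (hypothetical).
    blended = s3_df.merge(excel_df, on="customer_id", how="inner")
    print(blended.head())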
SAS, SPSS, and MATLAB meet S3: The challenge here is to blend proprietary SAS, SPSS, and MATLAB files stored on the Amazon S3 cloud. Will they blend? This […]
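A minimal Python sketch of reading these proprietary formats after fetching them from S3 (bucket and file names are hypothetical; pandas plus pyreadstat and SciPy are assumed):

    # Minimal sketch: download SAS, SPSS, and MATLAB files from S3 and read them.
    # Bucket and file names are placeholders.
    import boto3
    import pandas as pd
    from scipy.io import loadmat

    s3 = boto3.client("s3")
    for key in ["demo.sas7bdat", "demo.sav", "demo.mat"]:
        s3.download_file("example-bucket", key, key)

    sas_df = pd.read_sas("demo.sas7bdat")   # SAS
    spss_df = pd.read_spss("demo.sav")      # SPSS (requires pyreadstat)
    mat_data = loadmat("demo.mat")          # MATLAB .mat (dict of arrays)

    print(sas_df.shape, spss_df.shape, list(mat_data.keys()))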
Amazon S3 meets DynamoDB: This workflow demonstrates how a small dataset uploaded to S3 can be used as a basis for DynamoDB table creation and […]
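A minimal boto3 sketch of the same idea, creating a DynamoDB table and filling it from a small CSV on S3 (table, bucket, key, and column names are hypothetical, and the CSV is assumed to contain an "id" column):

    # Minimal sketch: create a DynamoDB table and populate it from a small CSV on S3.
    # All names are placeholders; the CSV is assumed to have an "id" column.
    import csv
    import io
    import boto3

    s3 = boto3.client("s3")
    dynamodb = boto3.resource("dynamodb")

    # Create a table keyed on a single string attribute.
    table = dynamodb.create_table(
        TableName="example-table",
        KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
        AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
        BillingMode="PAY_PER_REQUEST",
    )
    table.wait_until_exists()

    # Read the small dataset from S3 and write each row as an item.
    text = s3.get_object(Bucket="example-bucket", Key="small-dataset.csv")["Body"].read().decode("utf-8")
    with table.batch_writer() as batch:
        for row in csv.DictReader(io.StringIO(text)):
            batch.put_item(Item=row)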
Microsoft SharePoint Data Access: This workflow demonstrates how to use the Microsoft Authentication and SharePoint Online Connector nodes to connect to […]
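The KNIME nodes handle the interactive Microsoft sign-in; as a rough equivalent in code, here is a sketch using the msal package and the Microsoft Graph API (client ID, tenant, and site names are placeholders):

    # Minimal sketch: authenticate interactively against Azure AD with MSAL and query
    # a SharePoint Online site through the Microsoft Graph API. All IDs are placeholders.
    import msal
    import requests

    app = msal.PublicClientApplication(
        client_id="YOUR_CLIENT_ID",
        authority="https://login.microsoftonline.com/YOUR_TENANT_ID",
    )
    token = app.acquire_token_interactive(scopes=["Sites.Read.All"])
    headers = {"Authorization": f"Bearer {token['access_token']}"}

    # Look up a site by its path, then list its document libraries (drives).
    site = requests.get(
        "https://graph.microsoft.com/v1.0/sites/example.sharepoint.com:/sites/ExampleSite",
        headers=headers,
    ).json()
    drives = requests.get(
        f"https://graph.microsoft.com/v1.0/sites/{site['id']}/drives",
        headers=headers,
    ).json()
    for drive in drives["value"]:
        print(drive["name"])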
This workflow accesses data on Google Cloud Storage and on Microsoft SharePoint, blends it, and formats it into a table that is exported into a […]
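A minimal Python sketch of the same pattern, reading from Google Cloud Storage and blending with a SharePoint export that has already been saved locally (bucket, object, and file names are hypothetical):

    # Minimal sketch: read a CSV from Google Cloud Storage, blend it with a locally
    # saved SharePoint export, and write the result as one table. Names are placeholders.
    import pandas as pd
    from google.cloud import storage

    # Credentials are taken from GOOGLE_APPLICATION_CREDENTIALS or the local gcloud setup.
    client = storage.Client()
    client.bucket("example-bucket").blob("reports/data.csv").download_to_filename("gcs_data.csv")

    gcs_df = pd.read_csv("gcs_data.csv")
    sharepoint_df = pd.read_csv("sharepoint_export.csv")

    # Blend both sources and export the formatted table.
    blended = pd.concat([gcs_df, sharepoint_df], ignore_index=True)
    blended.to_csv("blended_report.csv", index=False)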
Data Transfer between Clouds: This workflow demonstrates the use of the new file system connection nodes in KNIME Analytics Platform 4.3 while reading, […]
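As an illustration of such a cloud-to-cloud transfer outside KNIME, a minimal sketch copying one object from Amazon S3 to Google Cloud Storage (bucket and object names are hypothetical):

    # Minimal sketch: copy an object from Amazon S3 to Google Cloud Storage.
    # Bucket and object names are placeholders.
    import boto3
    from google.cloud import storage

    s3 = boto3.client("s3")
    gcs = storage.Client()

    # Download from S3 into memory.
    data = s3.get_object(Bucket="example-s3-bucket", Key="exports/table.parquet")["Body"].read()

    # Upload the same bytes to a GCS bucket.
    gcs.bucket("example-gcs-bucket").blob("imports/table.parquet").upload_from_string(data)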
You can easily download and run the workflow within your KNIME installation. For optimal performance, we recommend using the latest version of the KNIME […]
Do you have feedback, questions, or comments about NodePit, want to support this platform, or want your own nodes or workflows listed here as well? Do you think the search results could be improved or something is missing? Then please get in touch! Alternatively, you can send us an email at mail@nodepit.com.
Please note that this is only about NodePit. We do not provide general support for KNIME — please use the KNIME forums instead.