There are 27 nodes that can be used as predecessors for a node with an input port of type File System.
Connects to Databricks File System (DBFS) in order to read/write files in downstream nodes.
Connects to the Databricks Unity File System in order to read/write files in downstream nodes.
Connects to a remote file system via FTP in order to read/write files in downstream nodes.
Provides a file system connection to an S3-compatible endpoint.
Creates a Hadoop Distributed File System (HDFS) connection in order to read/write files in downstream nodes.
Connects to HDFS using an Apache KNOX gateway in order to read/write files in downstream nodes.
Connects to a web server with HTTP(S) in order to read/write files in downstream nodes.
Provides a file system connection with access to the file system of the local machine.
Connects to an SMB server (e.g. Samba or Windows Server) in order to read/write files in downstream nodes.
Connects to a remote file system via SSH in order to read/write files in downstream nodes.