Spark to ORC

This Node Is Deprecated: This version of the node has been replaced with a new and improved version. The old version is kept for backwards compatibility, but we suggest using the replacement listed below for all new workflows.
Suggested replacement: Spark to ORC
Converts an incoming Spark DataFrame/RDD into an ORC table.

Notice: This feature requires at least Apache Spark 1.5.

Options

Save mode
How to handle existing data.
Partitions
Override the default partition count. This can be useful to reduce the number of output files, e.g. down to a single file (see the sketch below).
Warning: This might result in serious performance issues on huge data sets. Use with caution!
See the Spark documentation for more information.
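
Conceptually, the node performs the equivalent of a Spark DataFrameWriter call with the configured save mode and partition count. The following is a minimal sketch, written against the Spark 2.x SparkSession API rather than the Spark 1.5 API the node minimally requires; the input data set, application name, and output path are placeholder assumptions, not values used by the node.

import org.apache.spark.sql.{SaveMode, SparkSession}

object SparkToOrcSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("spark-to-orc-sketch")   // hypothetical application name
      .getOrCreate()

    // Hypothetical input; in KNIME the data arrives on the Spark DataFrame/RDD input port.
    val df = spark.read.parquet("/data/input")

    df.coalesce(1)                      // "Partitions" option: override the partition count,
                                        // e.g. to produce a single output file
      .write
      .mode(SaveMode.Overwrite)         // "Save mode" option: how existing data is handled
      .orc("/data/output/orc_table")    // target location on the connected file system

    spark.stop()
  }
}

Note that coalesce(1) funnels all data through a single task, which is what the performance warning above refers to.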

Input Ports

Spark compatible connection (HDFS, WebHDFS, HttpFS, S3, Blob Storage, ...)
Spark DataFrame/RDD

Output Ports

This node has no output ports

Views

This node has no views
