 

How can I append to the same file in HDFS (Spark 2.11)?

I am trying to store streaming data into HDFS using Spark Streaming, but it keeps creating new files instead of appending to one single file or to a few files.

If it keeps creating n files per batch, I feel it won't be very efficient.

[Screenshot: HDFS file system listing]

Code:

lines.foreachRDD(f => {
  if (!f.isEmpty()) {
    val df = f.toDF().coalesce(1)
    df.write.mode(SaveMode.Append).json("hdfs://localhost:9000/MT9")
  }
})

In my pom.xml I am using these dependencies:

  • spark-core_2.11
  • spark-sql_2.11
  • spark-streaming_2.11
  • spark-streaming-kafka-0-10_2.11
Asked Oct 23 '25 by andani

1 Answer

As you already realized, Append in Spark means write-to-existing-directory, not append-to-file.

This is intentional and desired behavior (consider what would happen if the process failed in the middle of "appending", even if the format and file system allowed it).

Operations like merging files should be applied by a separate process, if necessary at all, one that ensures correctness and fault tolerance. Unfortunately, this requires a full copy of the data, which for obvious reasons is not desirable on a batch-to-batch basis.
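Such a separate compaction process could look like the following sketch: a standalone Spark job that reads all the small JSON part files the streaming job produced and rewrites them as one file into a different directory. The paths, the output directory name, and the object name are illustrative assumptions, not from the original question.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

// Hypothetical periodic compaction job (run e.g. hourly/daily, outside
// the streaming application). Paths below are illustrative assumptions.
object CompactJson {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("compact-json")
      .getOrCreate()

    // Read all part files under the streaming output directory at once;
    // spark.read.json accepts a directory and loads every file in it.
    val df = spark.read.json("hdfs://localhost:9000/MT9")

    // coalesce(1) forces a single output file. This is only reasonable
    // when the compacted data is small enough for a single task to write.
    df.coalesce(1)
      .write
      .mode(SaveMode.Overwrite)
      .json("hdfs://localhost:9000/MT9-compacted")

    spark.stop()
  }
}
```

Writing the compacted result to a new directory (rather than overwriting the streaming output in place) avoids racing with the streaming job; the old directory can be swapped out or deleted once the compacted copy is verified.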

Answered Oct 25 '25 by user9988523