 

Spark structured streaming with kafka leads to only one batch (Pyspark)

I have the following code and I'm wondering why it generates only one batch:

df = spark.readStream.format("kafka") \
    .option("kafka.bootstrap.servers", "IP").option("subscribe", "Topic") \
    .option("startingOffsets", "earliest").load()
# group by on sliding windows
query = slidingWindowsDF.writeStream.queryName("bla").outputMode("complete").format("memory").start()

The application is launched with the following parameters:

spark.streaming.backpressure.initialRate 5
spark.streaming.backpressure.enabled True
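
For reference, a minimal sketch of how these settings could be supplied when building the session (the app name "bla" is a placeholder; passing the same keys via --conf flags on spark-submit is equivalent):

from pyspark.sql import SparkSession

# NOTE: placeholder app name; the spark.streaming.* keys are passed as plain session config
spark = SparkSession.builder \
    .appName("bla") \
    .config("spark.streaming.backpressure.initialRate", "5") \
    .config("spark.streaming.backpressure.enabled", "true") \
    .getOrCreate()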

The Kafka topic contains around 11 million messages. Because of the initialRate parameter, I expect at least two batches, but only one is generated. Can anyone tell me why Spark processes everything in a single batch?

I'm using Spark 2.2.1 and Kafka 1.0.

asked Nov 28 '25 by cronoik

1 Answer

That is because the spark.streaming.backpressure.initialRate parameter is used only by the old Spark Streaming (DStream) API, not by Structured Streaming.

Instead, use maxOffsetsPerTrigger: http://spark.apache.org/docs/latest/structured-streaming-kafka-integration.html
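
For example, a minimal sketch of the question's reader with that option added (the broker address "IP" and topic "Topic" are the placeholders from the question; the limit of 5 simply mirrors the intended initialRate):

df = spark.readStream.format("kafka") \
    .option("kafka.bootstrap.servers", "IP") \
    .option("subscribe", "Topic") \
    .option("startingOffsets", "earliest") \
    .option("maxOffsetsPerTrigger", 5) \
    .load()  # caps each micro-batch at 5 offsets, split across the topic's partitions

With ~11 million messages on the topic and startingOffsets set to earliest, leaving this option unset lets the first trigger read everything that is available, which is why a single batch appears.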

BTW, see also this answer: How Spark Structured Streaming handles backpressure? Structured Streaming currently does not have full backpressure support.

answered Dec 02 '25 by T. Gawęda


