
Reuse Spark session across multiple Spark jobs

I have around 10 Spark jobs, each of which performs some transformations and loads data into a database. Currently a Spark session has to be opened and closed for each job individually, and the initialization consumes time on every run.

Is it possible to create the Spark session only once and reuse it across multiple jobs?

asked Oct 16 '25 by Spark


1 Answer

Technically, if you use a single Spark session you will end up with a single Spark application, because you will have to package and run your multiple ETL (Extract, Transform, Load) pipelines within a single JAR file.
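As a rough sketch of that idea in PySpark: each ETL step becomes a function that accepts the shared session, and a single driver creates the session once and loops over all the jobs. The job names, paths, and table targets below are illustrative assumptions, not taken from the question.

```python
# Sketch: package several ETL steps into ONE Spark application so the
# SparkSession is initialized once and reused by every job.
# Sources, sinks, and job bodies are hypothetical placeholders.

def etl_orders(spark):
    # read -> transform -> load (illustrative parquet source and table sink)
    df = spark.read.parquet("/data/orders")
    df.filter("amount > 0").write.mode("append").saveAsTable("orders_clean")

def etl_customers(spark):
    df = spark.read.parquet("/data/customers")
    df.dropDuplicates(["id"]).write.mode("overwrite").saveAsTable("customers_clean")

# Register every job; the driver runs them all against the same session.
JOBS = [etl_orders, etl_customers]  # extend with the remaining jobs

def main():
    # Imported here so the module structure is inspectable without pyspark.
    from pyspark.sql import SparkSession
    spark = SparkSession.builder.appName("combined-etl").getOrCreate()
    try:
        for job in JOBS:
            job(spark)   # every job reuses the one session
    finally:
        spark.stop()     # tear down once, after all jobs have finished

if __name__ == "__main__":
    main()
```

The trade-off is that the jobs are now coupled: they share one application lifecycle, one set of Spark configuration values, and a single failure domain.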

If you are running those jobs on a production cluster, most likely you are using spark-submit to execute your application JAR, which has to go through the initialization phase every time you submit a job to the Spark master and workers in client mode.
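For reference, a submission of the combined application might look like the command below. The master URL, class name, and JAR path are illustrative assumptions; this is a command sketch, not output from the question.

```shell
# Hypothetical spark-submit invocation for the single combined JAR;
# initialization happens once per submit, not once per ETL step.
spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode client \
  --class com.example.CombinedEtl \
  combined-etl.jar
```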

In general, a long-running Spark session is mostly suitable for prototyping, troubleshooting, and debugging. For example, a single Spark session can be leveraged in spark-shell or in an interactive development environment such as Zeppelin, but not with spark-submit, as far as I know.

All in all, a couple of design/business questions are worth considering here: will merging multiple ETL jobs produce code that is easy to sustain, manage, and debug? Does it deliver the required performance gain? What does a risk/cost analysis show?

Hope this helps.

answered Oct 18 '25 by aelbuni