 

Is there a way to set multiple --conf as job parameters in AWS Glue?

I'm trying to configure Spark in my Glue jobs. When I enter the settings one by one under 'Edit job' → 'Job Parameters' as a key and value pair (e.g. key: --conf, value: spark.executor.memory=10g), it works, but when I try putting them all together (delimited by a space or comma), it results in an error. I also tried using sc._conf.setAll, but Glue ignores the config and insists on using its defaults. Is there a way to do this with Spark 2.4?

asked Dec 06 '25 by tn.splinter
1 Answer

Yes, you can pass multiple Spark settings through a single job parameter by chaining additional --conf flags inside the value:

Key: --conf

Value: spark.yarn.executor.memoryOverhead=7g --conf spark.executor.memory=7g

The value is passed through to the underlying Spark launch command, so each embedded --conf is parsed as its own flag.
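If you'd rather set this programmatically than in the console, here is a minimal sketch using boto3. The job name, role, and script location are placeholders (not from the original question), and note that UpdateJob overwrites the whole job definition, so any fields you omit are dropped:

import boto3

glue = boto3.client("glue")

# Hypothetical names -- substitute your own job, role, and script path.
glue.update_job(
    JobName="my-glue-job",
    JobUpdate={
        # UpdateJob replaces the entire job definition, so carry over
        # the fields you want to keep (role, command, etc.).
        "Role": "MyGlueServiceRole",
        "Command": {
            "Name": "glueetl",
            "ScriptLocation": "s3://my-bucket/scripts/job.py",
        },
        "DefaultArguments": {
            # A single --conf key; extra settings are chained by embedding
            # further --conf flags inside the value, as in the console.
            "--conf": (
                "spark.yarn.executor.memoryOverhead=7g "
                "--conf spark.executor.memory=7g"
            ),
        },
    },
)

Keep in mind that the AWS docs list --conf as internal to Glue ("Do not set"), so chaining settings this way is a community workaround rather than a supported interface; test it on a non-production job first.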

answered Dec 08 '25 by Nihir


