
Changing of tmp directory not working in Spark

Tags:

apache-spark

I wanted to change the tmp directory used by Spark, so I had something like this in my spark-submit:

 spark-submit <other parameters> --conf "spark.local.dir=<somedirectory>" <other parameters>

But I am noticing that it has no effect, as Spark still uses the default tmp directory. What am I doing wrong here?

By the way, I am using Spark in standalone cluster mode.

Asked by MetallicPriest

1 Answer

From https://spark.apache.org/docs/2.1.0/configuration.html

"In Spark 1.0 and later, spark.local.dir is overridden by the SPARK_LOCAL_DIRS (Standalone, Mesos) or LOCAL_DIRS (YARN) environment variables set by the cluster manager."
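
In other words, on a standalone cluster the scratch directory should be set via SPARK_LOCAL_DIRS on each worker rather than via --conf spark.local.dir. A minimal sketch, assuming a hypothetical scratch path /data/spark-tmp, would be to add this to conf/spark-env.sh on every worker node:

 # conf/spark-env.sh on each worker (the path /data/spark-tmp is just an example)
 export SPARK_LOCAL_DIRS=/data/spark-tmp

After editing the file, restart the worker processes so the new directory is picked up; the spark.local.dir setting passed to spark-submit will then be ignored in favor of this value.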

Answered by KZapagol