 

Hadoop Configuration in Spark

I need to get hold of the current Hadoop org.apache.hadoop.conf.Configuration in my Spark job, for debugging purposes. Specifically, I need to get an org.apache.hadoop.fs.FileSystem for a path using the org.apache.hadoop.fs.Path#getFileSystem(conf: Configuration) method.

Given an org.apache.spark.SparkContext, is there a way to get the Configuration?

asked Oct 15 '25 by Ygg

1 Answer

The SparkContext exposes the Hadoop Configuration directly as sc.hadoopConfiguration. For example, you can set a value on it like this:

sc.hadoopConfiguration.set("my.mapreduce.setting","someValue")
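
The same object can be used to resolve a FileSystem for a path, as the question asks. A minimal sketch (the app name, master URL, and path below are hypothetical, chosen just for this example):

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}
    import org.apache.spark.{SparkConf, SparkContext}

    // Assumption: local mode and the app name are placeholders for this sketch.
    val sc = new SparkContext(
      new SparkConf().setAppName("conf-demo").setMaster("local[*]"))

    // The Hadoop Configuration used by this Spark job.
    val conf: Configuration = sc.hadoopConfiguration

    // Resolve the FileSystem backing a (hypothetical) path
    // via Path#getFileSystem, as the question describes.
    val path = new Path("/tmp/some/path")
    val fs: FileSystem = path.getFileSystem(conf)

    println(s"fs.defaultFS = ${conf.get("fs.defaultFS")}")
    println(s"FileSystem implementation: ${fs.getClass.getName}")

    sc.stop()

Note that sc.hadoopConfiguration returns a live, mutable Configuration, so set(...) calls like the one above affect all subsequent Hadoop I/O performed through that SparkContext.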
answered Oct 18 '25 by Sahil Desai


