I need to get hold of the current Hadoop org.apache.hadoop.conf.Configuration in my Spark job, for debugging purposes. Specifically, I need to get an org.apache.hadoop.fs.FileSystem for a path with the org.apache.hadoop.fs.Path#getFileSystem(conf: Configuration) method.
Given an org.apache.spark.SparkContext, is there a way to get the Configuration?
You can get hold of the Hadoop Configuration through SparkContext.hadoopConfiguration; the same object can also be used to set properties, for example:
sc.hadoopConfiguration.set("my.mapreduce.setting", "someValue")
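As a minimal sketch of the debugging use case from the question, assuming an existing SparkContext named sc and a hypothetical path string (the path and the println are illustrative, not part of the original answer):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Grab the live Hadoop Configuration that Spark uses for Hadoop I/O
val conf: Configuration = sc.hadoopConfiguration

// Resolve the FileSystem backing a given path (path is a placeholder example)
val path = new Path("hdfs:///some/input/path")
val fs: FileSystem = path.getFileSystem(conf)

// e.g. print the URI of the resolved filesystem for debugging
println(fs.getUri)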