Can an instance of Hadoop's FileSystem, created from one valid HDFS URL, be reused for reading and writing different HDFS URLs? I have tried the following:
String url1 = "hdfs://localhost:54310/file1.txt";
String url2 = "hdfs://localhost:54310/file2.txt";
String url3 = "hdfs://localhost:54310/file3.txt";
//Creating filesystem using url1
FileSystem fileSystem = FileSystem.get(URI.create(url1), conf);
//Using same filesystem with url2 and url3
InputStream in = fileSystem.open(new Path(url2));
OutputStream out = fileSystem.create(new Path(url3));
This works, but will it cause any other issues?
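For context, FileSystem.get caches instances keyed by the URI's scheme and authority (host:port), not by the path, so all three URLs above resolve to the same instance. A minimal JDK-only sketch of that keying, using the question's URLs plus a hypothetical second namenode address for contrast (the real Hadoop cache key also includes the user, omitted here):

```java
import java.net.URI;

public class FsCacheKeyDemo {
    // Hadoop's FileSystem cache keys on (scheme, authority); sketch that comparison.
    static String cacheKey(String url) {
        URI u = URI.create(url);
        return u.getScheme() + "://" + u.getAuthority();
    }

    public static void main(String[] args) {
        String url1 = "hdfs://localhost:54310/file1.txt";
        String url2 = "hdfs://localhost:54310/file2.txt";
        // hypothetical different namenode, for illustration only
        String url3 = "hdfs://otherhost:9000/file3.txt";

        // url1 and url2 share scheme + authority, so one FileSystem serves both.
        System.out.println(cacheKey(url1).equals(cacheKey(url2))); // true
        // A different namenode address maps to a different instance; passing its
        // path to the first FileSystem would fail with a "Wrong FS" error.
        System.out.println(cacheKey(url1).equals(cacheKey(url3))); // false
    }
}
```

So reuse is safe as long as every path points at the same scheme and host:port.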
Yes, this is fine: a FileSystem instance is bound to a scheme and address (host:port), not to a particular path, so it can be reused for any path on the same namenode. FileSystem.get even caches instances by scheme and authority, so repeated calls with URLs that share them return the same object. You can also configure the default filesystem once and then retrieve it without a URI:
Configuration conf = new Configuration();
// fs.defaultFS is the current property name; fs.default.name is its deprecated alias
conf.set("fs.defaultFS", "hdfs://localhost:54310");
FileSystem fs = FileSystem.get(conf);
InputStream is = fs.open(new Path("/file1.txt"));
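Equivalently, in a real deployment the default filesystem is usually configured once in core-site.xml rather than in code, so that every client picks it up automatically. A sketch assuming the namenode address from the question:

```xml
<!-- core-site.xml: assumed namenode address taken from the question -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:54310</value>
  </property>
</configuration>
```

With this in place, `new Configuration()` loads the value and relative Paths like `/file1.txt` resolve against that namenode.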