Public DBFS root is disabled. Access is denied on path in Databricks community version

I am trying to get familiar with Databricks Community Edition. I successfully uploaded a table using the upload data feature, but when I try to call .show() on it, I get an error.

[Screenshot of the error message]

The error says something like "Public DBFS root is disabled. Access is denied on path". Any ideas?

asked Dec 07 '25 by Reactoo

1 Answer

DBFS (Databricks File System) is the legacy way to interact with files in Databricks. In the Community / Free edition you only get serverless compute, and on serverless compute access to legacy DBFS paths such as /FileStore is not allowed. Yes, some YouTube tutorials still use those paths, but on a serverless environment things work a little differently. You only have access to databricks-datasets, which contains sample datasets, and to Unity Catalog Volumes, which Databricks maintains to expose file storage.
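For example, you can check what is still reachable by listing the bundled sample datasets (a minimal sketch; dbutils.fs.ls is a standard Databricks utility, and the exact folders you see may vary):

# List a few of the built-in sample datasets that remain readable on serverless compute
for f in dbutils.fs.ls('/databricks-datasets')[:10]:
    print(f.path)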

Ideally you want to create a managed volume first:

catalog = 'main'
schema = 'default'
volume_name = 'vol1'

# Create a managed Unity Catalog volume to hold your uploaded files
spark.sql(f"CREATE VOLUME IF NOT EXISTS {catalog}.{schema}.{volume_name}")

Then upload your dataset into this volume and read it into a DataFrame like so:

file_name = 'your-file-name.csv'

# Read the uploaded CSV from the volume path
df = (
    spark.read.format('csv')
    .option('inferSchema', True)
    .option('header', True)
    .load(f'/Volumes/{catalog}/{schema}/{volume_name}/{file_name}')
)

display(df.limit(5))
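If you want an actual table you can query with SQL later, rather than just a DataFrame, a minimal sketch (assuming the main.default catalog and schema from above and a hypothetical table name uploaded_data):

# Persist the DataFrame as a managed Unity Catalog table (hypothetical name)
df.write.mode('overwrite').saveAsTable(f'{catalog}.{schema}.uploaded_data')

After that you can query it with spark.sql(f"SELECT * FROM {catalog}.{schema}.uploaded_data") or with plain SQL in a notebook cell.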

PS: display() gives you a richer, interactive table compared to df.show().

answered Dec 08 '25 by addicted


