Is it advisable to use spark._jsparkSession.catalog().tableExists() to check whether a table exists in Spark for a Databricks Delta table in PySpark?
The question arises because _jsparkSession is an internal attribute in PySpark, which users are presumably not supposed to access.
It can be used because there has not always been parity between the Scala and Python Catalog APIs: older versions of PySpark did not expose tableExists() on spark.catalog (it was added to the Python Catalog API in Spark 3.3). If you want to avoid the internal attribute, you can use something like this to check whether the table exists:
def table_exists(table: str, database: str = "default") -> bool:
    # Run SHOW TABLES in the target database and filter on the table name.
    tbl = spark.sql(f"show tables in `{database}`") \
        .filter(f"tableName = '{table}'")
    # The table exists if the filtered result is non-empty.
    return tbl.count() > 0
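For example, a minimal usage sketch, assuming a hypothetical Delta table named "events" in a database named "analytics":

# Hypothetical names: the "analytics" database and "events" table are placeholders.
if table_exists("events", database="analytics"):
    df = spark.read.table("analytics.events")
else:
    print("Table analytics.events does not exist")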