When I run the command
spark.sql("DROP TABLE IF EXISTS table_to_drop")
and the table does not exist, I get the following error:
AnalysisException: "Table or view 'table_to_drop' not found in database 'null';;
DropTableCommand `table_to_drop`, true, false, false"
The command works when the table does exist.
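As a stopgap I can guard the drop by checking the catalog first. This is a rough sketch, and it assumes that the table, when it exists, is registered in the default database:

# Stopgap guard (sketch): only issue the DROP when the catalog lists the table.
# Assumes `spark` is the notebook's SparkSession and that the table, when
# present, lives in the "default" database.
existing = [t.name for t in spark.catalog.listTables("default")]
if "table_to_drop" in existing:
    spark.sql("DROP TABLE table_to_drop")

But this guard should not be necessary if "IF EXISTS" worked the way I expect.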
I am using Python 3 in an Azure Databricks notebook, on Databricks Runtime 5.2 with a High Concurrency cluster.
The table was created using the following command:
df.write.option("path", "adl://***.azuredatalakestore.net/delta/table_to_drop").saveAsTable(name="table_to_drop", format="delta")
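For a self-contained reproduction, a toy DataFrame can stand in for my real data (the DataFrame contents are placeholders, and the ADLS path is elided as above):

# Minimal stand-in for the write step. The toy DataFrame and the elided
# ADLS path (***) are placeholders for my real data and storage account.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
df.write.option("path", "adl://***.azuredatalakestore.net/delta/table_to_drop") \
    .saveAsTable(name="table_to_drop", format="delta")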
I thought the whole point of "IF EXISTS" was to avoid exactly this error. Is this a bug, or am I misunderstanding something?