10-13-2022 03:55 AM
Hello,
I changed the DBR from 7.2 to 10.4 and now I receive the following error: AnalysisException: is not a Delta table.
The table was created USING DELTA, so it should definitely be a Delta table. I also read that from version 8 onwards all tables are Delta by default and you no longer need to write USING DELTA.
What can you tell me about this error?
AnalysisException: `default`.`stg_data_load` is not a Delta table.;
---------------------------------------------------------------------------
AnalysisException Traceback (most recent call last)
<command-4355583460198494> in <module>
49
50 # Fill staging table
---> 51 df.write.insertInto('STG_DATA_LOAD', overwrite = False)
/databricks/spark/python/pyspark/sql/readwriter.py in insertInto(self, tableName, overwrite)
1147 if overwrite is not None:
1148 self.mode("overwrite" if overwrite else "append")
-> 1149 self._jwrite.insertInto(tableName)
1150
1151 def saveAsTable(self, name, format=None, mode=None, partitionBy=None, **options):
/databricks/spark/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py in __call__(self, *args)
1302
1303 answer = self.gateway_client.send_command(command)
-> 1304 return_value = get_return_value(
1305 answer, self.gateway_client, self.target_id, self.name)
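For reference, one way to check how the metastore has actually registered the table (a minimal sketch; the table name is taken from the error message above, and the Provider row shows the format the metastore recorded):
# Sketch: check the provider recorded for the table in the metastore.
# If "Provider" is not delta, insertInto against it will raise the error above.
spark.sql("DESCRIBE EXTENDED default.stg_data_load") \
    .filter("col_name IN ('Provider', 'Location', 'Type')") \
    .show(truncate=False)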
10-13-2022 07:03 AM
Hi @JOSELITA MOLTISANTI, can you run the following commands and share the output?
# Look up the table's storage location from the metastore, then list the files at that location.
table_name = "stg_data_load"
path = spark.sql(f"describe detail {table_name}").select("location").collect()[0][0].replace('dbfs:', '')
dbutils.fs.ls(path)
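As a follow-up to the listing above (a sketch that reuses the path variable from the previous snippet): a Delta table's storage location contains a _delta_log directory, and the Delta Lake Python API can confirm the same thing.
from delta.tables import DeltaTable

# A Delta table's storage location contains a _delta_log transaction log directory.
has_delta_log = any(f.name.rstrip("/") == "_delta_log" for f in dbutils.fs.ls(path))
print("Contains _delta_log:", has_delta_log)

# Same check through the Delta Lake API, using the path from the snippet above.
print("Is Delta table:", DeltaTable.isDeltaTable(spark, path))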
10-13-2022 12:36 PM
Also, it would help if you could share the full code snippet. I see you are trying to use the write.insertInto command.
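If the checks above show the table is not registered as Delta (for example, because it was created without USING DELTA under the old runtime, where the default format was Parquet), one possible workaround is to recreate the staging table explicitly as Delta and retry the insert. This is only a sketch, assuming df is the DataFrame from your snippet and the existing table contents can be dropped or reloaded:
# Sketch: recreate the staging table as Delta, then retry the append.
# WARNING: this drops the existing table; back up its data first if it matters.
spark.sql("DROP TABLE IF EXISTS default.stg_data_load")
df.limit(0).write.format("delta").saveAsTable("default.stg_data_load")  # empty Delta table with df's schema

df.write.insertInto("STG_DATA_LOAD", overwrite=False)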