Hello,
I upgraded the DBR from 7.2 to 10.4 and now receive the following error: AnalysisException: is not a Delta table.
The table was created USING DELTA, so it is definitely a Delta table. I have also read that from version 8 onward all tables are Delta by default, so writing USING DELTA should not even be necessary.
What can you tell me about this error?
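For context, the relevant part of the notebook looks roughly like this (a minimal sketch; the table schema and column names are placeholders, and `spark` is assumed to be the usual Databricks SparkSession):

```python
# Minimal sketch of the setup described above (placeholder schema,
# assuming an active SparkSession named `spark` in the notebook).

# The staging table is created explicitly as Delta
# (which is also the default table format on DBR 8+):
ddl = """
CREATE TABLE IF NOT EXISTS default.stg_data_load (
    id BIGINT,
    payload STRING
) USING DELTA
"""

def fill_staging_table(spark, df):
    """Create the staging table if needed, then append rows into it."""
    spark.sql(ddl)
    # This is the call that raises the AnalysisException on DBR 10.4:
    df.write.insertInto("STG_DATA_LOAD", overwrite=False)
```

The full traceback from the `insertInto` call is below.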
AnalysisException: `default`.`stg_data_load` is not a Delta table.;
---------------------------------------------------------------------------
AnalysisException Traceback (most recent call last)
<command-4355583460198494> in <module>
49
50 # Fill staging table
---> 51 df.write.insertInto('STG_DATA_LOAD', overwrite = False)
/databricks/spark/python/pyspark/sql/readwriter.py in insertInto(self, tableName, overwrite)
1147 if overwrite is not None:
1148 self.mode("overwrite" if overwrite else "append")
-> 1149 self._jwrite.insertInto(tableName)
1150
1151 def saveAsTable(self, name, format=None, mode=None, partitionBy=None, **options):
/databricks/spark/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py in __call__(self, *args)
1302
1303 answer = self.gateway_client.send_command(command)
-> 1304 return_value = get_return_value(
1305 answer, self.gateway_client, self.target_id, self.name)