09-04-2024 02:03 AM
When running a Databricks notebook,
an error occurs stating that SOME_TABLE is not a Delta table.
However, running the DESCRIBE DETAIL command and checking the format
shows the table as delta.
Without taking any specific action, re-running the notebook resolves the issue,
and the notebook operates normally.
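For reference, this is roughly how I checked the format (a minimal sketch; some_schema.SOME_TABLE is a placeholder for the real name):
# Sketch of the format check; the table name is a placeholder.
detail = spark.sql("DESCRIBE DETAIL some_schema.SOME_TABLE")
detail.select("format").show()  # shows "delta" even though the read had failed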
Executed cluster:
DBR 13.3 LTS
Spark 3.4.1
Scala 2.12
Has anyone experienced this issue before?
If you know of a fundamental fix for this problem, please share it.
Thank you all in advance.
09-04-2024 02:41 AM
Can you share some code and error messages with us?
09-04-2024 03:02 AM
Due to security reasons, I can't share the entire code,
but in one function I'm trying to load and process a table using the following code:
from delta.tables import DeltaTable
map_df = DeltaTable.forName(spark, schema_table).toDF()
The schema_table variable contains the schema and table name where the error occurs. I've confirmed that there's no typo in the schema or table name, so the issue doesn't seem to be related to that.
Error messages:
Thank you, Witold.
09-04-2024 03:10 AM
"schema_table variable contains the schema and table"
forName doesn't expect a path, only a table name (optionally qualified with the schema); for a path you'd use forPath.
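To illustrate the difference (a minimal sketch; the table name and path are placeholders):
from delta.tables import DeltaTable
# forName resolves a metastore table name, optionally qualified with the schema.
dt = DeltaTable.forName(spark, "some_schema.some_table")
# forPath resolves the table directly from its storage location.
dt = DeltaTable.forPath(spark, "/mnt/data/some_table")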
09-04-2024 05:56 PM
Yeah, I know that.
The first time, that code didn't work.
Without changing anything, I just ran it again,
and then forName worked.
It's so curious.
09-04-2024 11:55 PM
Another thing you could check is what the underlying data looks like. Maybe the actual writer of the data messed it up.
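Something like this could help (a sketch; the table name is a placeholder, and display/dbutils are the Databricks notebook utilities):
# Look up the table's storage location, then inspect the files under it.
location = spark.sql("DESCRIBE DETAIL some_schema.some_table").first()["location"]
display(dbutils.fs.ls(location))  # a healthy Delta table has a _delta_log/ directory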