12-02-2022 09:25 AM
I have the following code in a notebook. It is randomly giving me the error, "At least one column must be specified for the table." The error occurs (if it occurs at all) only on the first run after attaching to a cluster.
Cluster details:
Summary
5-10 Workers: 320-640 GB Memory, 40-80 Cores
1 Driver: 64 GB Memory, 8 Cores
Runtime: 10.4.x-scala2.12 (Apache Spark 3.2.1)
Any ideas?
12-07-2022 01:03 AM
Creating a support request with Databricks might help you with this issue.
12-07-2022 08:39 AM
The issue occurs randomly. The challenge is to recreate it for the support team to look at. I am hoping that folks who have experienced a similar error will comment, and then maybe the Databricks folks will have something to investigate.
12-28-2022 04:20 AM
12-28-2022 06:09 AM
Sorry, no solution yet.
01-30-2023 09:53 AM
I tried reproducing the issue in a Databricks notebook using a 10.4 cluster and ran it a few times. Unfortunately, I couldn't reproduce the issue; it ran successfully every time. What is the frequency of this intermittent issue? If you re-run the command 10 times, would it throw the error once? I would still recommend filing a support ticket so that we can take a deeper look at this.
01-30-2023 10:28 AM
So, basically, I have addressed the issue (for now) by putting the culprit statement in a try/catch block with a few retries. The error still occurs but clears on the second retry.
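A minimal sketch of that retry approach, in Python. The helper name, retry counts, and the example SQL statement are illustrative assumptions, not the original notebook code:

```python
import time

def run_with_retries(action, max_retries=3, delay_seconds=5):
    """Run `action`, retrying on failure up to `max_retries` attempts."""
    for attempt in range(1, max_retries + 1):
        try:
            return action()
        except Exception as exc:
            if attempt == max_retries:
                raise  # give up after the last attempt
            print(f"Attempt {attempt} failed ({exc}); retrying in {delay_seconds}s")
            time.sleep(delay_seconds)

# In the notebook, this would wrap the failing statement, e.g. (hypothetical table names):
# run_with_retries(lambda: spark.sql("INSERT INTO my_table SELECT * FROM staging_table"))
```

A short delay between attempts gives the metastore time to settle, which matches the observation that the error clears on the second retry.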
01-30-2023 10:11 AM
Do you have multiple threads running these statements concurrently? If so, a race condition when updating the metastore could cause this issue.
01-30-2023 10:29 AM
I am not using any threading at all.
01-30-2023 10:19 AM
If you simply want to get rid of the table, you can drop it using the Hive client as well:
https://learn.microsoft.com/en-us/azure/databricks/kb/metastore/drop-table-corruptedmetadata
01-30-2023 10:31 AM
I just wanted to simplify the code for illustration purposes. In my case, the error occurs at the INSERT statement after the ALTER TABLE statement.
06-05-2023 06:21 PM
Please check whether this helps:
spark.databricks.delta.catalog.update.enabled false
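As a sketch, this property could be applied per session from a notebook via `spark.conf.set` (assuming the usual `spark` SparkSession object is available); setting it in the cluster's Spark config is the alternative shown above. Verify the flag's effect on your runtime before relying on it:

```python
# Hypothetical session-level workaround: disable Delta catalog updates.
# Requires an active SparkSession (the `spark` object in a Databricks notebook).
spark.conf.set("spark.databricks.delta.catalog.update.enabled", "false")

# Equivalent cluster-level setting (cluster Spark config, one line):
# spark.databricks.delta.catalog.update.enabled false
```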