02-01-2023 02:20 PM
Hi guys, I solved the problem. I got in contact with the support team; they explained to me that the jobs were moved because they reorganized the clusters in the last update.
01-23-2023 09:46 PM
Hi @Esaú Espinoza, to my knowledge there is no option to recover a deleted job in Databricks.
Meanwhile, you can raise a support request for this.
01-24-2023 07:57 AM
I don't know if this applies to your case, but clusters are deleted after a certain period of inactivity. The way to prevent this is to pin them.
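Pinning can be done from the cluster UI, and the Databricks REST API also exposes a pin endpoint (`POST /api/2.0/clusters/pin`). A minimal sketch, assuming you supply your own workspace host, personal access token, and cluster ID (the values below are placeholders, not real credentials):

```python
# Hedged sketch: build the REST call that pins a cluster so it is not
# auto-deleted after the inactivity window. Host, token, and cluster_id
# are assumptions you must replace with values from your own workspace.
import json
import urllib.request

def build_pin_request(host: str, token: str, cluster_id: str) -> urllib.request.Request:
    """Build the POST request for the Clusters API pin endpoint."""
    return urllib.request.Request(
        url=f"{host}/api/2.0/clusters/pin",
        data=json.dumps({"cluster_id": cluster_id}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Usage (needs a real workspace to actually send):
# req = build_pin_request("https://<workspace>.cloud.databricks.com",
#                         "<personal-access-token>", "0123-456789-abcde")
# urllib.request.urlopen(req)
```

Note that pinning only protects the cluster configuration; it does not undelete a cluster that is already gone.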
01-30-2023 01:54 PM
Are you using the Community Edition? You can open a support ticket and provide the cluster name and the time range in which you used it.
12-21-2023 03:22 AM
Dear folks,
If the table has been deleted, why am I unable to create a table with the same name?
It continuously gives me the error:
"DeltaAnalysisException: Cannot create table ('`spark_catalog`.`default`.`Customer_Data`'). The associated location ('dbfs:/user/hive/warehouse/customer_data') is not empty and also not a Delta table."
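This error usually means the table was dropped from the metastore but its data files are still sitting in the default warehouse location. A common fix, assuming the leftover data is disposable, is to delete that directory and then re-run `CREATE TABLE`. A minimal sketch (the regex helper is hypothetical, added only to pull the stale path out of the exception message; `dbutils` is available only inside a Databricks notebook, so that call is shown commented):

```python
# Hedged sketch: extract the stale location reported by the
# DeltaAnalysisException, then (in a Databricks notebook) remove it.
import re

ERROR = (
    "DeltaAnalysisException: Cannot create table "
    "('`spark_catalog`.`default`.`Customer_Data`'). The associated "
    "location ('dbfs:/user/hive/warehouse/customer_data') is not empty "
    "and also not a Delta table."
)

def leftover_location(message: str) -> str:
    """Pull the 'associated location' path out of the error message."""
    match = re.search(r"location \('([^']+)'\)", message)
    if match is None:
        raise ValueError("no location found in message")
    return match.group(1)

# Inside a Databricks notebook (dbutils is provided by the runtime):
# dbutils.fs.rm(leftover_location(ERROR), recurse=True)
# ...then re-run your CREATE TABLE statement.
```

Only do this if you are sure nothing under that path still needs to be kept; `dbutils.fs.rm(..., recurse=True)` is irreversible.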