- 1030 Views
- 1 replies
- 3 kudos
Can we connect to a Databricks Delta table from another workspace in a different subscription and run the VACUUM command on it?
Latest Reply
Hi @ASHUTOSH YADAV, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer. Thanks.
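A minimal sketch of what this could look like, assuming the table's files live in ADLS Gen2 storage that the second workspace's cluster is allowed to reach and that you address the table by path rather than by metastore name; the storage account, container, and path below are hypothetical placeholders.

```python
from delta.tables import DeltaTable

# Hypothetical ADLS Gen2 path reachable from this workspace's cluster.
# Storage credentials (e.g. a service principal) must be configured separately.
path = "abfss://container@storageaccount.dfs.core.windows.net/delta/events"

dt = DeltaTable.forPath(spark, path)  # address the table by storage path
dt.vacuum(168)                        # retain 168 hours (7 days) of history, the default retention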
- 10196 Views
- 3 replies
- 6 kudos
I have some Delta tables in our dev environment that started popping up with the following error today:
py4j.protocol.Py4JJavaError: An error occurred while calling o670.execute.
: org.apache.spark.SparkException: Job aborted due to stage failure: Tas...
Latest Reply
Hi @Jordan Yaker, we haven't heard from you since the last response from @Kaniz Fatma, and I was checking back to see if her suggestions helped you. Otherwise, if you have found a solution, please share it with the community, as it can be helpful to other...
by Nis • New Contributor II
- 1751 Views
- 1 replies
- 2 kudos
I have a Delta table whose size increases gradually; we now have around 1.5 crore (about 15 million) rows. While running the VACUUM command on that table, I am getting the below error. ERROR: Job aborted due to stage failure: Task 7 in stage 491.0 failed 4 times, most...
Latest Reply
Do you have access to the executor 7 logs? Is there high GC or some other event that is causing the heartbeat timeout? Would you be able to check the failed stages?
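For context on the heartbeat-timeout angle raised in that reply, here is a rough sketch of the standard Spark settings involved. These are ordinary Spark configs, not anything specific to this thread, and the raised values at the end are only example assumptions.

```python
# Current values (the defaults are shown as fallbacks). The heartbeat interval must stay
# well below spark.network.timeout, or executors are marked as lost during long GC pauses.
print(spark.conf.get("spark.executor.heartbeatInterval", "10s"))
print(spark.conf.get("spark.network.timeout", "120s"))

# If long GC pauses are confirmed in the executor logs, these are typically raised in the
# cluster's Spark config (Compute > Advanced options > Spark config), for example:
#   spark.executor.heartbeatInterval 60s
#   spark.network.timeout 600s
```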
by AP • New Contributor III
- 4604 Views
- 5 replies
- 3 kudos
So Databricks gives us a great toolkit in the form of OPTIMIZE and VACUUM. But in terms of operationalizing them, I am really confused about the best practice. Should we enable "optimized writes" by setting the following at a workspace level? spark.conf.set...
Latest Reply
@AKSHAY PALLERLA Just checking in to see if you got a solution to the issue you shared above. Let us know! Thanks to @Werner Stinckens for jumping in, as always!
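As a hedged illustration of the two levels the question contrasts, a sketch based on the documented Delta optimized-write settings; the table name is a hypothetical placeholder, and whether to set this cluster-wide or per table is exactly the judgment call the question is about.

```python
# 1) Session/cluster level: applies to all Delta writes in this session.
spark.conf.set("spark.databricks.delta.optimizeWrite.enabled", "true")
spark.conf.set("spark.databricks.delta.autoCompact.enabled", "true")

# 2) Per table, via table properties: applies only to this (hypothetical) table.
spark.sql("""
  ALTER TABLE my_delta_table
  SET TBLPROPERTIES (
    'delta.autoOptimize.optimizeWrite' = 'true',
    'delta.autoOptimize.autoCompact'   = 'true'
  )
""")
```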
- 1558 Views
- 1 replies
- 0 kudos
My VACUUM command is stuck. I am not sure if it's deleting any files.
Latest Reply
There is no direct way to track the progress of the VACUUM command. One easy workaround is to run a DRY RUN from another notebook, which will give an estimate of the files to be deleted at that point in time. This will give a rough estimate of files to b...
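A small sketch of the DRY RUN workaround described above, assuming a hypothetical table name.

```python
# DRY RUN lists files that would be deleted without removing anything, so it can be run
# from a second notebook while the real VACUUM is in flight.
files = spark.sql("VACUUM my_delta_table RETAIN 168 HOURS DRY RUN")
print(files.count())        # rough count of files currently eligible for deletion (may be capped)
files.show(truncate=False)  # their paths
```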
- 1712 Views
- 1 replies
- 0 kudos
For the OPTIMIZE command, I can give predicates, and it's easy to optimize the partitions where the data has been added. Similarly, can I specify a "WHERE" clause on the partition for a VACUUM command?
Latest Reply
It's by design: the VACUUM command does not support filters on the partition columns. This is because removing old files only partially can impact the time travel feature.
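A short illustration of that contrast, using a hypothetical table and partition column.

```python
# VACUUM takes only a retention window, never a partition predicate:
spark.sql("VACUUM sales_delta RETAIN 168 HOURS")

# OPTIMIZE, by contrast, does accept a WHERE clause on partition columns:
spark.sql("OPTIMIZE sales_delta WHERE event_date >= '2023-01-01'")
```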