by Ludo • New Contributor III
- 7079 Views
- 7 replies
- 2 kudos
Hello, this is a question about our platform, which runs `Databricks Runtime 11.3 LTS`. I'm running a Job with multiple tasks in parallel using a shared cluster. Each task runs a dedicated Scala class within a JAR library attached as a dependency. One of the tasks fails (c...
Latest Reply
Hi, this actually should not be marked as solved. We are having the same problem: whenever a Shared Job Cluster crashes for some reason (generally OOM), all tasks keep failing indefinitely, with the error message described above. This is ac...
6 More Replies
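For readers trying to picture the setup described in this thread, here is a minimal sketch, assuming the Jobs API 2.1, of a job where several JAR tasks reference one shared job cluster. The workspace URL, token, node type, JAR path, and class names are hypothetical placeholders, not the poster's actual configuration.

```python
import requests

# Hypothetical workspace and credentials -- replace with your own.
host = "https://<your-workspace>.cloud.databricks.com"
token = "<personal-access-token>"

payload = {
    "name": "parallel-jar-tasks",
    # One job cluster definition shared by every task below.
    "job_clusters": [
        {
            "job_cluster_key": "shared_cluster",
            "new_cluster": {
                "spark_version": "11.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 4,
            },
        }
    ],
    # Each task runs its own Scala main class from the same attached JAR,
    # and all of them point at the shared cluster via job_cluster_key.
    "tasks": [
        {
            "task_key": f"task_{i}",
            "job_cluster_key": "shared_cluster",
            "spark_jar_task": {"main_class_name": f"com.example.Task{i}"},
            "libraries": [{"jar": "dbfs:/FileStore/jars/my-lib.jar"}],
        }
        for i in range(1, 4)
    ],
}

resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())  # e.g. {"job_id": ...}
```

Because every task references the same `job_cluster_key`, they all depend on that one cluster staying healthy; if isolating tasks from each other matters more than cluster reuse, each task can instead reference its own entry in `job_clusters`, at the cost of spinning up more clusters.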
- 5982 Views
- 1 reply
- 2 kudos
My use case is to process a dataset spanning hundreds of partitions concurrently. The data is partitioned, and the partitions are disjoint. I was facing ConcurrentAppendException due to S3 not supporting the “put-if-absent” consistency guarantee. From Delta Lake ...
Latest Reply
Hi, you can refer to https://docs.databricks.com/optimizations/isolation-level.html#conflict-exceptions and recheck that everything is alright. Please let us know if this helps; also, please tag @Debayan in your next response, which will notify me. Th...
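As a rough illustration of what that docs page recommends for disjoint, partitioned writes, here is a sketch assuming a Delta table partitioned by a hypothetical `date` column, an existing `spark` session, an `updates_df` DataFrame holding one partition's increment, and an `id` key column. Pinning the partition value inside the merge condition lets Delta's conflict detection see that concurrent writers touch disjoint partitions, which is the usual way to avoid ConcurrentAppendException.

```python
from delta.tables import DeltaTable

# Hypothetical table path; each concurrent job handles exactly one partition.
target = DeltaTable.forPath(spark, "s3://my-bucket/my-delta-table")
partition_value = "2023-06-01"

(
    target.alias("t")
    .merge(
        updates_df.alias("s"),
        # The partition literal is part of the condition itself, so Delta can
        # prove this write only touches the 2023-06-01 partition.
        f"t.date = '{partition_value}' AND t.date = s.date AND t.id = s.id",
    )
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

The important detail, per the linked docs, is that the separation is explicit in the operation's condition; merely filtering `updates_df` beforehand is not enough for the conflict check.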
- 11640 Views
- 1 reply
- 4 kudos
The documentation explains how to use multicursor in notebooks, but it only covers Windows and macOS. The Windows shortcut worked in Linux (Ubuntu) until a few days ago, but it no longer does.
Latest Reply
@Davide Cagnoni: Multicursor support in Databricks notebooks is implemented using the Ace editor, which is a web-based code editor. Therefore, the behavior of multicursor support may depend on the specific browser and operating system you are using...
- 961 Views
- 0 replies
- 1 kudos
I have a multi-task job that runs every day, where the first notebook in the job checks whether the run should continue based on the date the job is run. The majority of the time the answer is no, and I'm raising an exception for the job to ...
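Since the question is cut off, here is only a minimal sketch of the gating pattern it describes: a first notebook that decides, from the run date, whether the rest of the job should proceed, and raises to stop it otherwise. The first-of-the-month condition is a hypothetical placeholder, not the poster's actual rule.

```python
from datetime import date

# Gating notebook: runs first in the multi-task job.
run_date = date.today()

# Placeholder rule -- the real check is whatever date logic the job needs.
should_continue = run_date.day == 1

if not should_continue:
    # Raising fails this task, which prevents the downstream tasks from running.
    raise Exception(f"Skipping run for {run_date}: nothing to process today.")

print(f"Continuing run for {run_date}.")
```

If the goal is to skip without the task being marked as failed, `dbutils.notebook.exit(...)` ends the notebook successfully instead; downstream tasks would then still run unless they check the value passed to `exit` or a task value set by this notebook.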
- 1497 Views
- 0 replies
- 0 kudos
I am trying to run an incremental data processing job using a Python wheel. The job is scheduled to run, e.g., every hour. For my code to know which data increment to process, I inject it with the {{start_time}} as part of the command line, like so: ["end_dat...
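Because the command-line example above is truncated, here is only a rough sketch of how a wheel entry point might parse such a start/end window passed as parameters. The flag names and the epoch-milliseconds format are assumptions, not the poster's exact parameters.

```python
import argparse
from datetime import datetime, timezone


def main() -> None:
    # Parse the increment boundaries injected by the job, e.g. via
    # ["--start_time", "{{start_time}}", "--end_time", "..."].
    parser = argparse.ArgumentParser(description="Incremental processing job")
    parser.add_argument("--start_time", required=True,
                        help="Start of the increment, epoch milliseconds (assumed format)")
    parser.add_argument("--end_time", required=True,
                        help="End of the increment, epoch milliseconds (assumed format)")
    args = parser.parse_args()

    start = datetime.fromtimestamp(int(args.start_time) / 1000, tz=timezone.utc)
    end = datetime.fromtimestamp(int(args.end_time) / 1000, tz=timezone.utc)

    print(f"Processing increment from {start.isoformat()} to {end.isoformat()}")
    # ... load and process only the data in [start, end) here ...


if __name__ == "__main__":
    main()
```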