- 2577 Views
- 1 replies
- 4 kudos
Hi Databricks Experts: I'm using Databricks on Azure. I'd like to understand the following: 1) whether there is a way of automating the re-run of specific failed tasks from a job (with several tasks), for example if I have 4 tasks, and tasks 1 and 2 h...
Latest Reply
You can use "retries". In Workflows, select your job and the task, and in the options below, configure retries. You can also see more options at: https://learn.microsoft.com/pt-br/azure/databricks/dev-tools/api/2.0/jobs?source=recommendations
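The same retry settings can also be set through the Jobs API. A minimal sketch of a task payload with retries configured; the task key and notebook path are invented for illustration:

```python
# Hedged sketch of a Jobs API task definition with retries.
# "task_1" and the notebook path are hypothetical, not from the thread.
task = {
    "task_key": "task_1",
    "notebook_task": {"notebook_path": "/Jobs/example"},
    # Retry this task up to 3 times, waiting 60s between attempts.
    "max_retries": 3,
    "min_retry_interval_millis": 60_000,
    "retry_on_timeout": False,
}
```

When a task fails, the scheduler re-runs only that task according to these settings, which matches the "re-run specific failed tasks" behaviour asked about above.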
by bradm0 • New Contributor III
- 1258 Views
- 3 replies
- 3 kudos
I'm trying to use the badRecordsPath option to catch improperly formed records in a CSV file and continue loading the remainder of the file. I can get the option to work using Python like this:
df = spark.read \
    .format("csv") \
    .option("header", "true") \
    .op...
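The behaviour badRecordsPath provides (divert malformed rows to a side location and keep loading the good ones) can be sketched in plain Python, independent of Spark. The sample data and the two-column layout are assumptions for illustration:

```python
import csv
import io

def load_with_bad_records(text, expected_cols):
    """Parse CSV text; collect rows with the wrong column count instead of failing."""
    good, bad = [], []
    for row in csv.reader(io.StringIO(text)):
        # A malformed record is diverted to `bad`, mimicking badRecordsPath.
        (good if len(row) == expected_cols else bad).append(row)
    return good, bad

data = "1,2\n3\n4,5\n"  # second line is malformed (only one column)
good, bad = load_with_bad_records(data, expected_cols=2)
# good → [["1", "2"], ["4", "5"]]; bad → [["3"]]
```

In Spark the diverted records are written as JSON files under the configured path rather than returned, but the load-and-continue idea is the same.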
Latest Reply
Thanks. It was the inferSchema setting. I tried it with and without the SELECT and it worked both ways once I added inferSchema. Both of these worked:
drop table my_db.t2;
create table my_db.t2 (col1 int, col2 int);
copy into my_db.t2
from (SELECT cast(...
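For context, inferSchema scans sample values and picks a type that every value fits, falling back to string otherwise. A toy sketch of that idea (not Databricks code; the function name and type ladder are invented):

```python
def infer_type(values):
    """Pick the narrowest of int/double/string that every sample value fits."""
    def fits(cast, v):
        try:
            cast(v)
            return True
        except ValueError:
            return False
    if all(fits(int, v) for v in values):
        return "int"
    if all(fits(float, v) for v in values):
        return "double"
    return "string"

# infer_type(["1", "2"]) → "int"; infer_type(["1", "2.5"]) → "double"
```

Without inference (or an explicit schema), every CSV column is read as string, which is why the load above behaved differently before inferSchema was added.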
- 5162 Views
- 4 replies
- 6 kudos
What is a common practice for writing a notebook that includes error/exception handling? Is there an example that shows how a notebook should be written to include error handling?
Latest Reply
When an exception is raised, the runtime looks for handlers (try-catch blocks) that are registered to handle such exceptions.
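A minimal sketch of that pattern in a Python notebook cell. The step name and workload function are hypothetical; in a real Databricks notebook you might call dbutils.notebook.exit on failure instead of re-raising:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("notebook")

def run_step(name, fn, *args, **kwargs):
    """Run one notebook step inside try/except; log and re-raise so the task fails visibly."""
    try:
        return fn(*args, **kwargs)
    except Exception:
        # Logging before re-raising keeps the stack trace in the driver logs
        # while still marking the job task as failed.
        log.exception("Step %r failed", name)
        raise

# Example usage with a hypothetical workload:
result = run_step("double", lambda x: x * 2, 21)
# result → 42
```

Re-raising (rather than swallowing the exception) matters in job notebooks: it lets the task fail, which is what triggers the retry settings discussed in the first thread.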