- 6104 Views
- 7 replies
- 8 kudos
Is there any way to propagate errors from dbutils?
I have a master notebook that runs a few different notebooks on a schedule using the dbutils.notebook.run() function. Occasionally, these child notebooks will fail (due to API connections or whatever). My issue is that when I attempt to catch the errors ...
I have the same issue. I see no reason that Databricks couldn't propagate the internal exception back through their WorkflowException.
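One widely used workaround, shown as a sketch only: have the child notebook catch its own exceptions and return them as a JSON string through dbutils.notebook.exit(), so the master can read the real error message instead of the opaque WorkflowException. It assumes the child notebook can be edited; the notebook path, the run_date argument, and the do_work() helper are hypothetical, not from the original post.

```python
import json

# --- child notebook (hypothetical) ---
# try:
#     result = do_work()  # placeholder for the child's actual logic
#     dbutils.notebook.exit(json.dumps({"status": "ok", "result": result}))
# except Exception as e:
#     dbutils.notebook.exit(json.dumps({"status": "error", "error": repr(e)}))

# --- master notebook ---
# dbutils.notebook.run(path, timeout_seconds, arguments) returns whatever the
# child passed to dbutils.notebook.exit()
raw = dbutils.notebook.run("/Shared/child_notebook", 3600, {"run_date": "2023-01-01"})
payload = json.loads(raw)
if payload["status"] == "error":
    # Re-raise with the child's real error text attached
    raise RuntimeError("Child notebook failed: " + payload["error"])
```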
- 1876 Views
- 3 replies
- 1 kudos
Databricks multi-stage jobs in workflow: params not passing
I am using a multi-stage job calling different notebooks that all have the same PARAMNAME that needs to be passed in. On the second and third tasks, I input a different value for the PARAM, but those values do not show up when the task runs. I...
Hi @David Byrd, this is a known issue and we have raised it with our engineering team. If you have the same key but different values in the parameters, it most likely takes the first value for the key and uses the same for all the tasks...
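For context, a minimal sketch of how per-task values for the same key are normally declared in a multi-task job spec and read back inside a notebook (the reported issue is that the first value wins across tasks). Only the key name PARAMNAME comes from the post; the job name, task keys, notebook paths, and values are illustrative.

```python
# Jobs 2.1-style multi-task spec, written out as a Python dict (not an API call).
job_spec = {
    "name": "multistage-example",
    "tasks": [
        {
            "task_key": "stage_1",
            "notebook_task": {
                "notebook_path": "/Shared/stage_1",
                "base_parameters": {"PARAMNAME": "value_for_stage_1"},
            },
        },
        {
            "task_key": "stage_2",
            "depends_on": [{"task_key": "stage_1"}],
            "notebook_task": {
                "notebook_path": "/Shared/stage_2",
                "base_parameters": {"PARAMNAME": "value_for_stage_2"},
            },
        },
    ],
}

# Inside each notebook, the task's own value is read back through a widget:
dbutils.widgets.text("PARAMNAME", "")           # declare the widget with an empty default
param_value = dbutils.widgets.get("PARAMNAME")  # value supplied by that task's base_parameters
```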
- 1588 Views
- 1 replies
- 0 kudos
Failed to start cluster.
Hello, I'm getting this message today every time I try to run a notebook since I logged in. When I click 'OK' nothing happens. I tried refreshing and switching browsers already, and tried Run this cell, Run below cells, Run all cells, etc. I also tried to cr...
- 564 Views
- 0 replies
- 1 kudos
Hi folks, I am trying a particular use case where I need to schedule a run of three different notebooks (pyspark, sql code) in sequence. I need to use a date field as a common parameter in all three (the date is part of the sql query in each nb's where clause...
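A minimal sketch of one way to do this from a driver notebook, assuming each of the three notebooks reads the date through a widget; the notebook paths and the run_date parameter name are hypothetical, not from the original post.

```python
from datetime import date

# Shared date value passed to all three notebooks
run_date = date.today().isoformat()

# Run the three notebooks one after another; an unhandled failure stops the sequence
for path in ["/Shared/01_extract", "/Shared/02_transform", "/Shared/03_report"]:
    # Each notebook reads it with dbutils.widgets.get("run_date") and
    # substitutes the value into its SQL WHERE clause.
    dbutils.notebook.run(path, 3600, {"run_date": run_date})
```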