01-24-2022 11:34 AM
I created a Databricks job with multiple tasks. Is there a way to pass variable values from one task to another? For example, if I have tasks A and B as Databricks notebooks, can I create a variable (e.g. x) in notebook A and later use that value in notebook B?
01-25-2022 04:31 AM
01-25-2022 07:17 AM
Thank you, Werner! From my understanding, if I run a notebook via the %run command, I won't have it as a separate task in the job. However, I'd also like to keep the ability to see it in the task view for transparency. Is there a way to achieve that?
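For context, here is a minimal sketch of the %run approach being discussed, assuming notebook A lives at a hypothetical path /Shared/notebook_A:

```python
# Notebook A (hypothetical path: /Shared/notebook_A) defines the variable once:
x = 42

# Notebook B pulls A's definitions into its own namespace by running it inline.
# In a Databricks notebook, %run is a magic command placed in its own cell,
# shown here as a comment:
# %run /Shared/notebook_A

# After the %run cell executes, x is available directly in notebook B:
print(x)  # 42
```

Because %run executes notebook A inside notebook B's context, both notebooks share one namespace, which is why A no longer appears as a separate task in the job.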
01-25-2022 07:23 AM
Hm, I don't think so, as the notebooks will actually run within the same task.
01-25-2022 07:26 AM
You could also consider using an orchestration tool like Data Factory (Azure) or Glue (AWS); there you can inject parameters into notebooks and use them.
The Databricks job scheduler also lets you add parameters, but I do not know whether you can determine the input dynamically (based on another task).
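As a rough illustration of the parameter-injection idea, assuming the job or a Data Factory pipeline passes a parameter named x to the notebook (the parameter name here is hypothetical):

```python
# Inside the downstream notebook: declare a text widget so the notebook can
# receive "x" as a parameter (from the job's parameter settings or from an
# Azure Data Factory notebook activity), with an empty string as the default.
dbutils.widgets.text("x", "")

# Read the injected value; widget values arrive as strings.
x = dbutils.widgets.get("x")
print(f"Received parameter x = {x}")
```

In Azure Data Factory, the value would typically be supplied as a base parameter on the Databricks Notebook activity, matching the widget name.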
01-25-2022 07:43 AM
Thank you! We're on Azure. Will explore Data Factory!