- 4828 Views
- 3 replies
- 2 kudos
I have created a Databricks workflow job with notebooks as individual tasks linked sequentially. I assign a value to a variable in one notebook task (e.g. batchid = int(time.time())). Now I want to pass this batchid variable to the next notebook task. What...
Latest Reply
@brickster You would use dbutils.jobs.taskValues.set() and dbutils.jobs.taskValues.get(). See the docs for more details: https://docs.databricks.com/workflows/jobs/share-task-context.html
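A minimal sketch of that pattern, assuming the producing task's key is `set_batch` (a hypothetical name):

```python
import time

# In the first notebook task (task key "set_batch"): publish the value.
batch_id = int(time.time())
dbutils.jobs.taskValues.set(key="batchid", value=batch_id)

# In the next notebook task: read it back.
batch_id = dbutils.jobs.taskValues.get(
    taskKey="set_batch",  # the task that set the value
    key="batchid",
    default=0,            # returned if the key was never set
    debugValue=0,         # used when the notebook runs outside a job
)
```

Task values must be JSON-serializable, so an integer timestamp like this works; the sequential task dependency guarantees the value is set before the downstream task reads it.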
2 More Replies
- 3770 Views
- 1 reply
- 0 kudos
Using Airflow, I have created a DAG with a sequence of notebook tasks. The first notebook returns a batch id; the subsequent notebook tasks need this batch_id. I am using the DatabricksSubmitRunOperator to run the notebook tasks. This operator pushes ...
Latest Reply
From what I understand, you want to pass a run_id parameter to the second notebook task? You can: create a widget param inside your Databricks notebook (https://docs.databricks.com/notebooks/widgets.html) that will consume your run_id. Pass the paramet...
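A sketch of that approach. Inside the Databricks notebook, the widget is declared and read like this:

```python
# In the Databricks notebook: declare a widget and read the parameter.
dbutils.widgets.text("run_id", "")
run_id = dbutils.widgets.get("run_id")
```

On the Airflow side, the value can be passed via `base_parameters`; the cluster id, notebook path, and upstream task id below are hypothetical:

```python
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

second_notebook = DatabricksSubmitRunOperator(
    task_id="second_notebook",
    existing_cluster_id="1234-567890-abcde123",  # hypothetical cluster id
    notebook_task={
        "notebook_path": "/Shared/second_notebook",  # hypothetical path
        "base_parameters": {
            # DatabricksSubmitRunOperator pushes the run id to XCom
            # under the key "run_id", which can be pulled here.
            "run_id": "{{ ti.xcom_pull(task_ids='first_notebook', key='run_id') }}",
        },
    },
)
```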
by
rammy
• Contributor III
- 9855 Views
- 5 replies
- 5 kudos
I have tried the following ways to get job parameters, but none of them works:
runId='{{run_id}}'
jobId='{{job_id}}'
filepath='{{filepath}}'
print(runId," ",jobId," ",filepath)
r1=dbutils.widgets.get('{{run_id}}')
f1=dbutils.widgets.get('{{file...
Latest Reply
Thanks for your response. I found the solution. The code below gives me all the job parameters:
all_args = dbutils.notebook.entry_point.getCurrentBindings()
print(all_args)
Thanks for your support.
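For reference, a hedged sketch of working with those bindings; the parameter names below are hypothetical:

```python
# getCurrentBindings() returns the notebook's widget/parameter bindings
# as a Java map; wrapping it in dict() gives a plain Python dict.
all_args = dict(dbutils.notebook.entry_point.getCurrentBindings())
print(all_args)

# Individual job parameters can then be read by name (hypothetical keys):
run_id = all_args.get("run_id")
filepath = all_args.get("filepath")
```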
4 More Replies
by
RJB
• New Contributor II
- 12227 Views
- 6 replies
- 0 kudos
I am trying to create a job which has 2 tasks as follows:
- A Python task which accepts a date and an integer from the user and outputs a list of dates (say, a list of 5 dates in string format).
- A notebook which runs once for each of the dates from the d...
Latest Reply
Just a note that this feature, Task Values, has been generally available for a while.
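Since the question involved a list of dates, here is a minimal sketch with Task Values, assuming the producing Python task's key is `generate_dates` (a hypothetical name); task values accept any JSON-serializable payload, so a list of date strings works:

```python
from datetime import date, timedelta

# Python task "generate_dates": compute the dates and publish them.
start = date(2023, 1, 1)  # hypothetical user input
dates = [str(start + timedelta(days=i)) for i in range(5)]
dbutils.jobs.taskValues.set(key="dates", value=dates)

# Downstream notebook task: read the list back and process each date.
dates = dbutils.jobs.taskValues.get(
    taskKey="generate_dates",
    key="dates",
    default=[],
)
for d in dates:
    print(d)
```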
5 More Replies
- 17007 Views
- 4 replies
- 1 kudos
So, since I would run a git_source as a notebook_task inside a Databricks job, I read that it's possible to forward to the notebook_task (and of course now to git_source) a bunch of parameters via the `base_parameters` field on the REST API. But, on my gi...
Latest Reply
The way I was able to fix it was by installing `databricks-connect` as a pip library in my local dev environment. This emulates the whole Databricks `dbutils` package, even if it wouldn't work locally. But since I just needed to develop to have the func...
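A sketch of that local setup, assuming classic databricks-connect (where `DBUtils` is importable from `pyspark.dbutils`):

```python
# pip install databricks-connect
from pyspark.sql import SparkSession
from pyspark.dbutils import DBUtils  # provided by databricks-connect

spark = SparkSession.builder.getOrCreate()
dbutils = DBUtils(spark)

# dbutils now resolves locally, so notebook code that references it
# (e.g. dbutils.widgets.get, dbutils.fs.ls) can be developed and run
# against the cluster databricks-connect is configured for.
```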
3 More Replies