Data Engineering

Forum Posts

brickster
by New Contributor II
  • 2394 Views
  • 3 replies
  • 2 kudos

Passing values between notebook tasks in Workflow Jobs

I have created a Databricks workflow job with notebooks as individual tasks, sequentially linked. I assign a value to a variable in one notebook task (ex: batchid = int(time.time())). Now, I want to pass this batchid variable to the next notebook task. What...

Latest Reply
fijoy
Contributor
  • 2 kudos

@brickster You would use dbutils.jobs.taskValues.set() and dbutils.jobs.taskValues.get(). See the docs for more details: https://docs.databricks.com/workflows/jobs/share-task-context.html
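The set/get contract the reply describes looks roughly like this. A sketch only: `dbutils` exists solely inside a Databricks notebook, so a dict-backed stand-in illustrates the call shapes, with the real Databricks calls shown in comments (the task key "task1" is a hypothetical name):

```python
import time

# Minimal dict-backed stand-in for dbutils.jobs.taskValues, for local
# illustration only.
class TaskValues:
    def __init__(self):
        self._store = {}

    def set(self, key, value):
        # Real call in the first task:
        #   dbutils.jobs.taskValues.set(key="batch_id", value=batch_id)
        self._store[key] = value

    def get(self, taskKey, key, default=None):
        # Real call in the next task:
        #   dbutils.jobs.taskValues.get(taskKey="task1", key="batch_id")
        return self._store.get(key, default)

task_values = TaskValues()

# --- First notebook task: produce and publish the batch id ---
batch_id = int(time.time())
task_values.set(key="batch_id", value=batch_id)

# --- Next notebook task: read it back ---
received = task_values.get(taskKey="task1", key="batch_id", default=-1)
print(received == batch_id)  # True
```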

Choolanadu
by New Contributor
  • 2100 Views
  • 1 reply
  • 0 kudos

Airflow - How to pull XComs value in the notebook task?

Using Airflow, I have created a DAG with a sequence of notebook tasks. The first notebook returns a batch id; the subsequent notebook tasks need this batch_id. I am using the DatabricksSubmitRunOperator to run the notebook task. This operator pushes ...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 0 kudos

From what I understand, you want to pass a run_id parameter to the second notebook task? You can:
  • Create a widget param inside your Databricks notebook (https://docs.databricks.com/notebooks/widgets.html) that will consume your run_id
  • Pass the paramet...
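A sketch of the widget-consumption side of that suggestion. The task id and parameter names are hypothetical, and since `dbutils` only exists on Databricks, a dict-backed stand-in shows the call shapes:

```python
# Airflow side: the base_parameters of DatabricksSubmitRunOperator become
# notebook widgets, and an XCom value can be templated into them, e.g.:
#   base_parameters={"batch_id": "{{ ti.xcom_pull(task_ids='first_task') }}"}
#
# Notebook side: read that value through a widget. Local stand-in below.
class Widgets:
    def __init__(self, injected):
        # Parameters injected by the job runner (simulated here).
        self._values = dict(injected)

    def text(self, name, defaultValue, label=None):
        # Real call: dbutils.widgets.text("batch_id", "0")
        self._values.setdefault(name, defaultValue)

    def get(self, name):
        # Real call: dbutils.widgets.get("batch_id")
        return self._values[name]

widgets = Widgets({"batch_id": "1700000000"})  # injected via base_parameters
widgets.text("batch_id", "0")                  # declares the widget with a default
batch_id = int(widgets.get("batch_id"))
print(batch_id)  # 1700000000
```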

rammy
by Contributor III
  • 5016 Views
  • 6 replies
  • 5 kudos

How I could read the Job id, run id and parameters in python cell?

I have tried the following ways to get the job parameters, but none of them work:

runId = '{{run_id}}'
jobId = '{{job_id}}'
filepath = '{{filepath}}'
print(runId, " ", jobId, " ", filepath)
r1 = dbutils.widgets.get('{{run_id}}')
f1 = dbutils.widgets.get('{{file...

Latest Reply
rammy
Contributor III
  • 5 kudos

Thanks for your response. I found the solution. The code below gives me all the job parameters:

all_args = dbutils.notebook.entry_point.getCurrentBindings()
print(all_args)

Thanks for your support.
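For shape, the two approaches from this thread side by side: the internal entry point returns every binding at once, while `dbutils.widgets.get()` fetches one named parameter per call. A local sketch with stand-in bindings (`dbutils` is Databricks-only; the parameter values here are made up):

```python
# Stand-in for the parameters a job would inject, e.g. via
#   base_parameters={"run_id": "{{run_id}}", "job_id": "{{job_id}}", ...}
bindings = {"run_id": "12345", "job_id": "678", "filepath": "/tmp/input.csv"}

# Accepted answer (internal, undocumented API) returns all parameters
# at once as a mapping:
#   all_args = dbutils.notebook.entry_point.getCurrentBindings()
all_args = dict(bindings)

# Documented alternative: one named parameter per call:
#   run_id = dbutils.widgets.get("run_id")
run_id = bindings["run_id"]

print(all_args["job_id"], run_id)  # 678 12345
```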

RJB
by New Contributor II
  • 7533 Views
  • 6 replies
  • 0 kudos

Resolved! How to pass outputs from a python task to a notebook task

I am trying to create a job which has 2 tasks as follows:
  • A Python task which accepts a date and an integer from the user and outputs a list of dates (say, a list of 5 dates in string format).
  • A notebook which runs once for each of the dates from the d...

Latest Reply
BilalAslamDbrx
Honored Contributor II
  • 0 kudos

Just a note that this feature, Task Values, has been generally available for a while.
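Applied to this question, the Task Values pattern might look roughly like this: the Python task builds the list of date strings and publishes it as a task value (task values must be JSON-serializable), and the downstream notebook reads it back. A sketch only; the task key "build_dates" is a hypothetical name, and the Databricks-only calls are shown in comments:

```python
import json
from datetime import date, timedelta

# Python task: build a list of n date strings from the user's start date.
def make_dates(start, n):
    return [(start + timedelta(days=i)).isoformat() for i in range(n)]

dates = make_dates(date(2023, 1, 1), 5)

# Publish as a task value (Databricks-only call, shown for shape):
#   dbutils.jobs.taskValues.set(key="dates", value=dates)
# A list of strings is JSON-serializable, so it qualifies.
payload = json.dumps(dates)

# Downstream notebook task reads it back:
#   dates = dbutils.jobs.taskValues.get(taskKey="build_dates", key="dates")
received = json.loads(payload)
print(received[0], received[-1])  # 2023-01-01 2023-01-05
```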

antoniodavideca
by New Contributor III
  • 8000 Views
  • 4 replies
  • 1 kudos

Resolved! install dbutils locally

So. Since I run a git_source as a notebook_task inside a Databricks job, I read that it's possible to pass the notebook_task (and of course now the git_source) a bunch of parameters via the `base_parameters` field on the REST API. But, on my gi...

Latest Reply
antoniodavideca
New Contributor III
  • 1 kudos

The way I was able to fix it was by installing `databricks-connect` as a pip library in my local dev environment. This emulates the whole Databricks `dbutils` package, even if it wouldn't work locally. But since I just needed to develop to have the func...
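One common shape for this local-development setup is a try/except shim: use the `DBUtils` class that `databricks-connect` provides when it's available, and otherwise fall back to a small stub so the notebook code still imports and runs locally. A sketch under that assumption, not a definitive implementation:

```python
# Use databricks-connect's DBUtils if installed; otherwise fall back to a
# local stub so code that calls dbutils.widgets still executes.
try:
    from pyspark.dbutils import DBUtils          # ships with databricks-connect
    from pyspark.sql import SparkSession
    dbutils = DBUtils(SparkSession.builder.getOrCreate())
except Exception:
    class _Widgets:
        def __init__(self):
            self._values = {}

        def text(self, name, defaultValue, label=None):
            # Mirrors dbutils.widgets.text(name, defaultValue)
            self._values.setdefault(name, defaultValue)

        def get(self, name):
            # Mirrors dbutils.widgets.get(name)
            return self._values[name]

    class _DbutilsStub:
        def __init__(self):
            self.widgets = _Widgets()

    dbutils = _DbutilsStub()

dbutils.widgets.text("batch_id", "0")
print(dbutils.widgets.get("batch_id"))  # "0" when running with the stub
```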
