
How to send parameters from an HTTP request to a notebook running in a job

AmpolJon
New Contributor

I've been trying to trigger a job run via an n8n workflow, and the workflow starts the notebook properly.

BUT there's another requirement: I also have to send some data to that job run. I googled it and couldn't find a solution anywhere. My setup is:

Method: POST
URL: https://abcd.databricks.com/api/2.2/jobs/run-now
JSON attached to the request:
{
  "job_id": "569943055844936",
  "job_parameters": {
    "param1": "string1",
    "param2": "string2"
  },
  "notebook_params": {
    "input_path": "dbfs:/mnt/data/train.csv",
    "num_epochs": "20"
  }
}
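
For anyone reproducing this outside of n8n, here's a minimal Python sketch of the same request (the workspace URL, token, and job ID are placeholders from the question, not working values):

import requests

# Placeholders: substitute your own workspace URL and personal access token.
HOST = "https://abcd.databricks.com"
HEADERS = {"Authorization": "Bearer <personal-access-token>"}

payload = {
    "job_id": "569943055844936",
    "job_parameters": {"param1": "string1", "param2": "string2"},
    "notebook_params": {"input_path": "dbfs:/mnt/data/train.csv", "num_epochs": "20"},
}

# POST to the Jobs run-now endpoint, exactly as the n8n HTTP node does.
resp = requests.post(f"{HOST}/api/2.2/jobs/run-now", headers=HEADERS, json=payload)
print(resp.status_code, resp.json())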

In the Python notebook:

    params = dbutils.widgets.getAll()
    print(params)

it returns nothing.
6 REPLIES

szymon_dybczak
Esteemed Contributor III

Hi @AmpolJon,

Passing parameters this way is not supported by this API. That's why the notebook_params you define are ignored:
Trigger a new job run | Jobs API | REST API reference | Databricks on AWS

You can check the following workaround, though. The user there uses the Update API to set task parameters and then triggers the job. Not the most beautiful solution in the world, but it should work:

https://stackoverflow.com/a/75607277

To be precise: passing parameters at the task level is not supported by this API.
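
For illustration, here's a rough Python sketch of that workaround. The task_key, notebook path, and token are placeholders, and note that the tasks array sent to the Update API replaces the job's existing task list, so it has to describe the task in full:

import requests

HOST = "https://abcd.databricks.com"  # placeholder workspace URL
HEADERS = {"Authorization": "Bearer <personal-access-token>"}
JOB_ID = 569943055844936

# Step 1: overwrite the task-level base_parameters via the Update API.
# "main_task" and the notebook path are placeholders for the job's real task;
# compute settings are omitted for brevity.
requests.post(
    f"{HOST}/api/2.2/jobs/update",
    headers=HEADERS,
    json={
        "job_id": JOB_ID,
        "new_settings": {
            "tasks": [
                {
                    "task_key": "main_task",
                    "notebook_task": {
                        "notebook_path": "/Workspace/path/to/notebook",
                        "base_parameters": {"param1": "string1", "param2": "string2"},
                    },
                }
            ]
        },
    },
)

# Step 2: trigger the run; the notebook now reads the updated parameters.
requests.post(f"{HOST}/api/2.2/jobs/run-now", headers=HEADERS, json={"job_id": JOB_ID})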

szymon_dybczak
Esteemed Contributor III

Or you can consider using job parameters instead of task parameters in this case 🙂
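
With parameters defined at the job level, the run-now request body can then stay minimal, something along these lines (job ID copied from the question):

{
  "job_id": "569943055844936",
  "job_parameters": {
    "param1": "string1",
    "param2": "string2"
  }
}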

BS_THE_ANALYST
Esteemed Contributor

@AmpolJon the job parameter should do the trick, as @szymon_dybczak mentions. It's supported in the docs:

[screenshot: Jobs API docs showing job_parameters in the run-now request]

Just be mindful that job parameters take precedence over task parameters. In your case though, task parameters won't work through the API, so configure your solution to leverage the job parameters 🙂.

Please let us know how you get on.

All the best,
BS


AmpolJon
New Contributor

[screenshot: error returned by the notebook run]

I've tried passing job_parameters as

"job_parameters": {
  "param1": "string1",
  "param2": "string2"
}

hoping to read string1 in Python with this code:

param = dbutils.widgets.get("param1")

It returns the error shown in the attached picture. I just want to be sure: should I give up on this method, or am I on the right path?

BS_THE_ANALYST
Esteemed Contributor

@AmpolJon I don't think you should give up on this method. The API allows you to pass job parameters, and you can retrieve them from within the Python notebook. Here's an example.

1. Call the API: https://docs.databricks.com/api/workspace/jobs/runnow

[screenshot: run-now API call sending job_parameters, as in the JSON example above]

2. Now I take a look at the job and check whether the parameters were passed through. Please note, I didn't set these up as part of the job; they came through the API call alone. Also, I forgot to tick "Performance Optimized" in the job config, so my job takes a while longer to run 🙂.

[screenshot: job run showing the parameters received from the API call]


3. Here's the run itself, and it errors, just like yours @AmpolJon 

[screenshot: failed run with the same widget error]

4. Here's the solution:
Successful job run:

[screenshot: successful job run]

Notebook:

[screenshot: notebook with the default widget values defined]

Code (the code comments explain the solution):

# Set a default value to make the widget exist. These will get overwritten anyway! :)
dbutils.widgets.text("param1", "default_value")
dbutils.widgets.text("param2", "default_value")

# Now you can access them (and they'll be overwritten)
p1 = dbutils.widgets.get("param1")
p2 = dbutils.widgets.get("param2")

print(f"param1={p1}, param2={p2}")


Hopefully that resolves your problem @AmpolJon. I'd love to hear more about how you're using n8n and Databricks together as part of your project. If you find the time, it'd be great to share it with the Databricks community; I'd love to get more project-based learning out there.

All the best,
BS
