10-11-2024 03:10 AM
Hello! I have been using the Databricks REST API to run workflows via the /api/2.1/jobs/run-now endpoint. Now I also want to include job_parameters in my call. I have defined the parameters param1 and param2 in my workflow, and in my API call I'm sending this:
params ={"job_id" : 12312312
"job_parameters":{"param1":"12",
"param2":"test"},}. But for some reason I get the following response: {'error_code': 'MALFORMED_REQUEST', 'message': "Could not parse request object: Expected 'START_OBJECT' not 'VALUE_STRING'\n at....
Does anyone know what the problem might be? I've also tried copying the sample request from the documentation: https://docs.databricks.com/api/azure/workspace/jobs/runnow#job_parameters but it's not working either.
10-11-2024 08:03 AM
Hi @ErikJ ,
Are you sending a POST request to the /api/2.1/jobs/run-now endpoint? It's important to use a POST request because this endpoint expects the parameters to be included in the request body as JSON, not as query parameters.
Also, your "params" seem malformed, with a missing comma after "job_id": 12312312 and an extra comma after "job_parameters". This might just be due to copy-pasting, but incorrect JSON formatting can cause the MALFORMED_REQUEST error you're seeing.
Here's how your JSON body should look:
{
"job_id" : 12312312,
"job_parameters":
{
"param1":"12",
"param2":"test"
}
}
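For reference, here's a minimal sketch of sending that body with Python's requests library. The host, token, and job ID are placeholders you'd replace with your own values:

```python
import requests

def run_job_now(host: str, token: str, payload: dict) -> dict:
    """Trigger a job run via POST /api/2.1/jobs/run-now."""
    resp = requests.post(
        f"{host}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {token}"},
        # json= serializes the dict as a JSON request body and sets the
        # Content-Type: application/json header; this endpoint expects the
        # payload in the body, not in the URL query string.
        json=payload,
    )
    resp.raise_for_status()
    return resp.json()

payload = {
    "job_id": 12312312,
    "job_parameters": {"param1": "12", "param2": "test"},
}
# run_job_now("https://<your-workspace>.azuredatabricks.net", "<token>", payload)
```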
10-14-2024 03:42 AM
{
"job_id": ,
"job_parameters": {
"params1": "",
"params2": ""
}
}
Yes, my bad maybe, but it is correctly formatted, and I'm using POST exactly as described in the Databricks REST API documentation. I even copied their example JSON, but that didn't help. I just know that I can run the job via the API, but as soon as I add parameters inside "job_parameters" I get this in return. I will look into it today!
10-18-2024 12:37 PM
I just had the same error. The cause for me was that I didn't have depends_on formatted correctly. This may not be the same problem for you, but it might be worth checking similar things, since the error message gives no hint about which field is actually wrong.
I was using: "depends_on": ["A"]
It was fixed by switching to: "depends_on": [{"task_key":"A"}]
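For context, here's roughly what a correctly formatted dependency looks like inside a job's tasks array. The task keys and notebook paths below are made-up examples:

```json
{
  "tasks": [
    {
      "task_key": "A",
      "notebook_task": { "notebook_path": "/Shared/a" }
    },
    {
      "task_key": "B",
      "depends_on": [{ "task_key": "A" }],
      "notebook_task": { "notebook_path": "/Shared/b" }
    }
  ]
}
```

Each entry in depends_on must be an object with a task_key field, not a bare string.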
10-21-2024 05:39 AM - edited 10-21-2024 05:40 AM
@ErikJ I ran into a similar issue before. Instead of using job_parameters, I used notebook_params, and it worked:
params = {
"job_id": 12312312,
"notebook_params": {
"param1": "12",
"param2": "test"
}
}
So, lesson learned: if job_parameters isn't accepted, it's worth trying notebook_params as well.
10-21-2024 06:06 AM
Hello, and thank you for the replies! I found the issue. For some reason I had params=params in my requests.post(), but I swapped to json=params and now it works.
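For anyone hitting the same thing, the difference looks like this (the URL and token below are placeholders):

```python
# import requests  # needed when you uncomment the calls below

url = "https://<your-workspace>.azuredatabricks.net/api/2.1/jobs/run-now"
headers = {"Authorization": "Bearer <token>"}
params = {
    "job_id": 12312312,
    "job_parameters": {"param1": "12", "param2": "test"},
}

# Wrong: params= URL-encodes the dict into the query string, so the server
# receives job_parameters as a plain string and returns MALFORMED_REQUEST
# ("Expected 'START_OBJECT' not 'VALUE_STRING'").
# requests.post(url, headers=headers, params=params)

# Right: json= sends the dict as a JSON request body.
# requests.post(url, headers=headers, json=params)
```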