Wednesday
I am trying to follow your documentation on how to create a serverless job via the API: https://docs.databricks.com/api/workspace/jobs/create#environments-spec-environment_version. Sending the JSON request does create a serverless cluster for my test job, which I can see in the Databricks UI. However, even though I specify the environment and dependencies in my JSON, they seem to be completely ignored; by default I get serverless environment 3. Can you give me a hint about what is going wrong in this case?
Here is my JSON:
{
  "environments": [
    {
      "environment_key": "default_python",
      "spec": {
        "environment_version": "1",
        "dependencies": [
          "/Volumes/<mycatalog>/..."
        ]
      }
    }
  ],
  "name": "${{ parameters.TEST_RUN_TITLE }}",
  "tasks": [
    {
      "task_key": "pytest_task",
      "notebook_task": {
        "notebook_path": "${{ parameters.NOTEBOOK_PATH }}",
        "base_parameters": {
          "build_id": "$(Build.BuildId)"
        }
      }
    }
  ]
}
Wednesday
Hi @anazen13,
Are you using the Databricks CLI to create the serverless job, or are you calling the REST API directly?
Thursday
Yeah, I define the JSON based on the doc link provided in my original message and then use the CLI to create a new job:
JOB_RESPONSE=$(databricks jobs create --json "$JOB_JSON")
JOB_ID=$(echo "$JOB_RESPONSE" | jq -r '.job_id')
Thursday
Ok, thanks for the info @anazen13. If I were you, I would double-check that you're using the latest version of the Databricks CLI. Your payload looks correct, but if you're on an older version of the CLI, the new environment_version attribute might not be recognized.
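You can check the installed version with databricks --version. It's also worth checking what actually got stored on the job: if the environments block is missing from the stored settings, the spec was dropped on the way in. A quick check could look like this (reusing the JOB_ID variable from your snippet, and assuming jq is available):
# print the CLI version that's actually on the PATH
databricks --version
# fetch the stored job definition and show its environments block
databricks jobs get "$JOB_ID" --output json | jq '.settings.environments'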
If that doesn't help, then it's possible you've found a bug. In that case, you can report it to Databricks support.
Thursday
Hey, thanks for responding quickly. The version I use is Databricks CLI v0.265.0. I guess that should be good enough, because I see the most recent one, v0.266.0, was released just yesterday.
Thursday
So, I've tested it in my environment and it worked as expected, but I used the newest version of the CLI.
So, here's my payload:
Here's the CLI command I used:
databricks jobs create --json @payload.json
And here's the outcome: a new job was created with the specification I defined in my JSON payload.
Thursday
I just tried updating my JSON to match yours as closely as possible, but still no success.
I think I will report this issue to Databricks support anyway so they can help debug what's not right in my case. But thanks so much for your time.
Thursday
Sure, no problem. Let us know about the outcome.
Wednesday
It looks like you're hitting one of the current limitations of Databricks serverless jobs. Even though the API accepts an environments object, only certain fields are honored right now. In particular:
- pinning environment_version has no effect, so the job falls back to the default serverless environment, and
- dependencies declared inline in the environments spec are not installed.
So in your example the JSON is valid, but the fields you're setting are ignored, because serverless doesn't currently allow custom environment pinning or inline dependency injection.
If you need strict control over the environment/dependencies, you'll want to either:
- install the dependencies from inside the notebook itself, e.g. with %pip install at the top of the notebook (see the example below), or
- run the task on classic jobs compute, where you can pin the runtime version and attach libraries on the cluster definition.
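For the dependency part specifically, the usual workaround on serverless is a %pip install at the top of the notebook, pointing at your wheel on the Volume (the path below is just a placeholder):
%pip install /Volumes/<catalog>/<schema>/<volume>/<your_package>.whl
If the package was already imported in the session, you may also need dbutils.library.restartPython() in a following cell.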
I'd also keep an eye on the Jobs API release notes, since Databricks has been expanding serverless features fairly quickly.
Thursday
Hi siennafaleiro, thanks for your reply. You might be right, but then I find it strange that they list the entire JSON in the official docs as an example, and yet some of it works while some fields are ignored.