02-14-2025 05:35 AM
Hello,
I have a case where I am executing notebooks from an external system using the Databricks API /api/2.2/jobs/runs/submit. This has always worked without issue on job compute, but with the recent introduction of serverless support for notebooks, it was decided to switch to that.
The issue I have is a limitation of serverless environment version 1: it does not support schema evolution in merge operations. Last week serverless environment version 2 was introduced, which supports the withSchemaEvolution method of the DeltaMergeBuilder class, so it solves the issue I was facing initially. But it introduces a new one: the only way I am able to change the version is through the UI at the notebook level, and once Databricks content gets freshly deployed to the workspace, the value is set back to 1. I don't want to change the value from 1 to 2 after every deployment.
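For context, this is the merge pattern that fails on version 1 (a minimal sketch; the table name, join key, and updates_df are made up):

```python
from delta.tables import DeltaTable

# `spark` is the session provided by the notebook; `updates_df` is the
# incoming DataFrame whose schema may contain new columns (both assumed here).
target = DeltaTable.forName(spark, "main.sales.orders")  # hypothetical table

(
    target.alias("t")
    .merge(updates_df.alias("s"), "t.order_id = s.order_id")
    .withSchemaEvolution()  # not available on serverless environment version 1
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```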
I tried defining the environment in the API call as per the documentation, setting the client value to "2":
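Roughly the call I made (host, token, and notebook path are placeholders; the payload shape follows the documented environments spec):

```python
import requests

HOST = "https://<workspace-url>"   # placeholder
TOKEN = "<personal-access-token>"  # placeholder

payload = {
    "run_name": "serverless-notebook-run",
    "environments": [
        {
            "environment_key": "serverless_v2",
            "spec": {"client": "2"},  # "client" selects the environment version
        }
    ],
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Workspace/path/to/notebook"},
            # Adding "environment_key": "serverless_v2" here is the second
            # attempt described below, which is rejected for notebook tasks.
        }
    ],
}

response = requests.post(
    f"{HOST}/api/2.2/jobs/runs/submit",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
response.raise_for_status()
print(response.json())  # {"run_id": ...}
```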
I also tried referencing the environment_key defined under environments from the task section, but that returns an error saying environments can't be passed to notebook tasks.
My question is whether there is any way to programmatically specify in the API call that I want to use serverless environment version 2, or to otherwise make version 2 the default at the workspace level for all notebooks.
02-15-2025 04:40 PM
Hello @tts,
There is no way to programmatically specify the version of the serverless environment for notebook tasks in the Databricks API call /api/2.2/jobs/runs/submit. Currently, serverless environment versions must be set at the notebook level through the UI, and this setting gets reset to version 1 every time Databricks content is deployed to the workspace. I will check internally whether there is a workaround for now.
a month ago
Hello @Alberto_Umana,
Thank you for the response. Let me know if you find any way to handle this, or if it is at least something that will be available in the future.
4 weeks ago
Hi @Alberto_Umana,
Have there been any updates on this internally?
4 weeks ago
Hi @tts,
Thanks for following up. I noticed that serverless version 2 is now the default version. Are you still hitting the failure?
4 weeks ago
Did not notice that, but you are right. Thanks for the follow-up and for your help.
4 weeks ago
As an alternative, the environment for serverless could be set in the asset bundle job configuration, as shown in the sketch below.
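For reference, a sketch of what that looks like in a bundle's job definition (names and paths are placeholders):

```yaml
# Fragment of a hypothetical databricks.yml; the environment is declared per
# job and referenced from a task via environment_key.
resources:
  jobs:
    nightly_job:
      name: nightly-job
      environments:
        - environment_key: serverless_v2
          spec:
            client: "2"
      tasks:
        - task_key: main
          environment_key: serverless_v2
          spark_python_task:
            python_file: ../src/main.py
```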
3 weeks ago
This only applies to Python script, Python wheel, and dbt tasks, but in my case I want to be able to set it for notebook tasks as well. Currently, when notebooks are deployed with asset bundles, the value still gets set to 1. If I create a new notebook in the workspace, the environment version is 2. So what I struggle to find is how to set the value for notebook tasks, or whether that is even possible at the moment.

