01-08-2025 08:27 PM
I am working on deploying a Databricks job to the production environment using a PowerShell script in an Azure DevOps release pipeline. The task requires updating the job configuration JSON file to set the job's compute to serverless. For this, I need to set the `existing_cluster_id` field in the job config. Is there a way to programmatically retrieve the dynamic serverless cluster ID so that I can use it to update the `existing_cluster_id` field in the JSON job configuration?

Here's the PowerShell code I am currently working with:
$serverlessId = "xxxx-xxxxxx-xxxxxxxx-xxx"
$workspaceBasePath = "/path/toWorkspace/"
$JobPath = "$(ARTIFACT_PATH)\DatabricksBundle\jobs\job-settings.json"

function Import-Job {
    param (
        [Parameter(Mandatory = $true)]
        [string]$jsonPath
    )
    if (-not (Test-Path $jsonPath)) {
        Write-Output "Error: The job settings file does not exist at the path: $jsonPath"
        throw "File not found"
    }
    try {
        # Read the file as a single string before parsing it as JSON
        $jsonContent = Get-Content -Path $jsonPath -Raw | ConvertFrom-Json
        # Replace any existing cluster reference with the serverless ID
        $jsonContent.PSObject.Properties.Remove("existing_cluster_id")
        $jsonContent | Add-Member -MemberType NoteProperty -Name "existing_cluster_id" -Value $serverlessId
        $jsonPayload = $jsonContent | ConvertTo-Json -Depth 10
        Write-Output "job-config-settings content: $jsonPayload"
        # $databricksUrl and $headers are defined earlier in the pipeline
        $response = Invoke-RestMethod -Uri "$databricksUrl/api/2.1/jobs/create" -Method Post -Headers $headers -Body $jsonPayload -ContentType "application/json"
        Write-Output "Job created successfully: $($response.job_id)"
    } catch {
        Write-Output "Error creating job: $_"
    }
}

Import-Job -jsonPath $JobPath
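In case it helps anyone with a similar script: since there is no stable serverless cluster ID to inject, one alternative is to go the other way and strip all cluster references from the config before calling the Jobs API, so the tasks default to serverless compute. This is only a sketch; the function name and the sample config below are illustrative, not part of my actual pipeline:

```powershell
# Sketch: remove cluster references so the Jobs API defaults tasks to serverless.
function Remove-ClusterSettings {
    param (
        [Parameter(Mandatory = $true)]
        [pscustomobject]$JobConfig
    )
    # Remove the top-level cluster reference, if present
    $JobConfig.PSObject.Properties.Remove("existing_cluster_id")
    # Multi-task jobs: remove per-task cluster references as well
    if ($JobConfig.PSObject.Properties["tasks"]) {
        foreach ($task in $JobConfig.tasks) {
            $task.PSObject.Properties.Remove("existing_cluster_id")
            $task.PSObject.Properties.Remove("new_cluster")
            $task.PSObject.Properties.Remove("job_cluster_key")
        }
    }
    return $JobConfig
}

# Example: a config that previously pinned a cluster
$config = '{ "name": "demo", "tasks": [ { "task_key": "t1", "existing_cluster_id": "1234-abc" } ] }' | ConvertFrom-Json
$clean = Remove-ClusterSettings -JobConfig $config
$clean | ConvertTo-Json -Depth 10
```

The cleaned payload can then be posted to `/api/2.1/jobs/create` exactly as in the original script.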
Any suggestions or examples on how to get the serverless cluster ID in PowerShell would be greatly appreciated!
01-09-2025 02:03 AM
Is this field actually required? When deploying a job via the API, if you omit all cluster details, the system will automatically run the job on serverless compute. This is because, for jobs, the serverless compute will not be the same on each run, so there is no stable cluster ID to reference.
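For example, as far as I understand it, a minimal multi-task job payload with no cluster fields at all (the job name, task key, and notebook path below are illustrative) would be scheduled on serverless compute by default:

```json
{
  "name": "nightly-etl",
  "format": "MULTI_TASK",
  "tasks": [
    {
      "task_key": "main",
      "notebook_task": {
        "notebook_path": "/path/toWorkspace/etl-notebook"
      }
    }
  ]
}
```

Note there is no `existing_cluster_id`, `new_cluster`, or `job_clusters` entry anywhere; that omission is what lets the Jobs API put the tasks on serverless compute.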
01-14-2025 09:58 PM
Thank you. It worked with multi-task jobs. I was trying with a single-task job, and it looks like that format requires this ID, right? Is there a way to deploy single-task jobs on serverless compute?
01-15-2025 06:31 AM
Per the docs, the single-task format has been deprecated, so you will need to use the multi-task format now: https://docs.databricks.com/api/workspace/jobs/create#format
01-15-2025 07:32 AM
Got it! Thanks for the info.
01-15-2025 08:14 AM
Sure, happy to assist. If any of my responses helped, I would really appreciate it if you could accept it as a solution.