a month ago
I am working on deploying a Databricks job to the production environment using a PowerShell script in an Azure DevOps release pipeline. The task requires updating the job configuration JSON file to set the job's compute to serverless. For this, I need to set the `existing_cluster_id` field in the job config. Is there a way to programmatically retrieve the dynamic serverless cluster ID so that I can use it to update the `existing_cluster_id` field in the JSON job configuration? Here's the PowerShell code I am currently working with:
$serverlessId = "xxxx-xxxxxx-xxxxxxxx-xxx"
$workspaceBasePath = "/path/toWorkspace/"
$JobPath = "$(ARTIFACT_PATH)\DatabricksBundle\jobs\job-settings.json"

function Import-Job {
    param (
        [Parameter(Mandatory = $true)]
        [string]$jsonPath
    )
    if (-not (Test-Path $jsonPath)) {
        Write-Output "Error: The job settings file does not exist at the path: $jsonPath"
        throw "File not found"
    }
    try {
        # Read and parse the job settings JSON
        $jsonContent = Get-Content -Path $jsonPath -Raw | ConvertFrom-Json
        # Replace any existing cluster id with the serverless id
        $jsonContent.PSObject.Properties.Remove("existing_cluster_id")
        $jsonContent | Add-Member -MemberType NoteProperty -Name "existing_cluster_id" -Value $serverlessId
        $jsonPayload = $jsonContent | ConvertTo-Json -Depth 10
        Write-Output "job-config-settings content: $jsonPayload"
        # $databricksUrl and $headers are set earlier in the pipeline
        $response = Invoke-RestMethod -Uri "$databricksUrl/api/2.1/jobs/create" -Method Post -Headers $headers -Body $jsonPayload -ContentType "application/json"
        Write-Output "Job created successfully: $($response.job_id)"
    } catch {
        Write-Output "Error creating job: $_"
    }
}
Import-Job -jsonPath $JobPath
Any suggestions or examples on how to get the serverless cluster ID in PowerShell would be greatly appreciated!
a month ago
Is this field actually required? When deploying a job through the API, if you don't set any cluster details on the job, the system will automatically run it on serverless compute. This is because, for jobs, the serverless compute is not the same on each run, so there is no stable cluster ID to reference.
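To illustrate, a minimal job-settings payload along those lines (job name, task key, and notebook path are placeholders): it simply omits `existing_cluster_id`, `new_cluster`, and `job_clusters`, and the job should run on serverless compute:

```json
{
  "name": "my-serverless-job",
  "tasks": [
    {
      "task_key": "main_task",
      "notebook_task": {
        "notebook_path": "/path/toWorkspace/my_notebook"
      }
    }
  ]
}
```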
3 weeks ago
Thank you. It worked with multi-task jobs. I was trying with a single-task job, and it looks like that requires this ID, right? Is there a way to deploy single-task jobs on serverless compute?
3 weeks ago
Per the docs, the single-task format has been deprecated, so you will need to use the multi-task format now: https://docs.databricks.com/api/workspace/jobs/create#format
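For reference, a sketch of what the multi-task payload might look like (names and paths are placeholders): the work moves into a `tasks` array, each entry keyed by `task_key`, with no cluster ID needed:

```json
{
  "name": "deploy-job",
  "format": "MULTI_TASK",
  "tasks": [
    {
      "task_key": "run_notebook",
      "notebook_task": {
        "notebook_path": "/path/toWorkspace/my_notebook"
      }
    }
  ]
}
```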
3 weeks ago
Got it! Thanks for the info.
3 weeks ago
Sure, happy to assist. If any of my responses helped, I would really appreciate it if you could accept it as a solution.