Hello, Enhanced autoscaling is available by default for Lakeflow Spark Declarative Pipelines and is enabled for serverless jobs; on classic compute you can enable it using the setting described here.
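For classic compute, the setting lives in the pipeline's cluster definition. A minimal sketch of the relevant fragment of the pipeline settings JSON (the worker counts are placeholders, and "default" is just the standard cluster label):

```json
{
  "clusters": [
    {
      "label": "default",
      "autoscale": {
        "min_workers": 1,
        "max_workers": 5,
        "mode": "ENHANCED"
      }
    }
  ]
}
```

Setting "mode": "ENHANCED" opts the pipeline cluster into enhanced autoscaling; omitting it falls back to legacy autoscaling behavior.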
Hello, Databricks job cluster autoscaling makes decisions from Spark scheduler signals (pending and queued tasks versus available task slots, plus idleness windows), not from raw CPU utilization alone. Enhanced autoscaling uses task queue size and task slot utilization. Autosc...
Hello, you can integrate Databricks job failures with ServiceNow using webhook notifications from Jobs.
In ServiceNow, create an inbound REST or Scripted REST API that takes the JSON payload and creates an incident.
In Databricks, edit the job, add a notificati...
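On the ServiceNow side, the inbound endpoint just needs to map the webhook's JSON body onto an incident record. A minimal sketch of that mapping in Python, assuming an illustrative payload shape (field names on both the Databricks webhook side and the incident side are assumptions; check the actual payload your destination receives and your incident table's schema):

```python
import json


def build_incident(event: dict) -> dict:
    """Map an assumed Databricks job-failure webhook payload onto an
    assumed ServiceNow incident record. All field names here are
    placeholders for illustration, not a documented contract."""
    job = event.get("job", {})
    run = event.get("run", {})
    return {
        "short_description": f"Databricks job {job.get('name', 'unknown')} failed",
        # Keep the full payload in the description for triage.
        "description": json.dumps(event, indent=2),
        "urgency": "2",
        # Hypothetical custom field for correlating back to the run.
        "u_databricks_run_id": str(run.get("run_id", "")),
    }


# Example payload, shape assumed for illustration only.
payload = {"job": {"name": "nightly_etl"}, "run": {"run_id": 12345}}
incident = build_incident(payload)
```

The resulting dict would then be inserted by your Scripted REST API (or posted to the incident table) according to your instance's conventions.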
Hello, The PAT is an authentication credential for your service principal; authorization is evaluated at request time against the current permissions of that principal (and token permissions, if enabled), not against its permissions at the moment the token was created. So if yo...
Hello Amit, You can automate Unity Catalog permissions management using the Databricks Terraform provider instead of ad-hoc scripts. With the databricks_grants resource you can declaratively manage privileges at the catalog, schema, table, and table/v...
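As a minimal sketch, a catalog-level grant managed declaratively (the catalog name "sales" and the group "data_engineers" are placeholders for your own objects and principals):

```hcl
resource "databricks_grants" "sales_catalog" {
  catalog = "sales"

  grant {
    principal  = "data_engineers"
    privileges = ["USE_CATALOG", "USE_SCHEMA", "SELECT"]
  }
}
```

Because databricks_grants is authoritative for the privileges it manages on that object, running terraform apply converges the live grants to what is declared, which is what makes it a good replacement for one-off GRANT scripts.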