Hi @smpa01, make sure the API token or authentication you're using is linked to an account that has the necessary permissions on the job. You can also try running the job directly from the Databricks UI; if it works there but not via the API, it ind...
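To rule out the call itself, here is a minimal sketch of triggering a run through the Jobs 2.1 `run-now` endpoint with Python's standard library. The host, token, and job id are placeholders; the token must belong to a principal that can run the job.

```python
import json
import urllib.request

def build_run_now_request(host: str, token: str, job_id: int):
    """Build the URL, headers, and JSON body for a Jobs API 2.1 run-now call."""
    url = f"https://{host}/api/2.1/jobs/run-now"
    headers = {
        # The bearer token decides which principal's permissions are checked
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"job_id": job_id}).encode("utf-8")
    return url, headers, body

def trigger_job(host: str, token: str, job_id: int) -> dict:
    """Fire the run and return the JSON response (contains run_id on success)."""
    url, headers, body = build_run_now_request(host, token, job_id)
    req = urllib.request.Request(url, data=body, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example call (hypothetical workspace host, token, and job id):
# trigger_job("adb-1234567890.0.azuredatabricks.net", "dapi...", 987654321)
```

A 403 from this call with a token that works in the UI is the clearest sign the API principal lacks permissions on the job.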
Hi @jakobhaggstrom, this error likely occurs because of the type of secret you're using. For M2M authentication, the Databricks JDBC driver requires a Databricks-generated OAuth secret, not a Microsoft Entra ID client secret. While your service principal...
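For reference, a sketch of what the JDBC URL looks like with OAuth M2M. The property names (`AuthMech=11`, `Auth_Flow=1`, `OAuth2ClientId`, `OAuth2Secret`) are as I recall them from the Databricks JDBC driver docs, so please verify against your driver version; host and http path are placeholders.

```python
def build_m2m_jdbc_url(host: str, http_path: str, client_id: str, oauth_secret: str) -> str:
    """Assemble a Databricks JDBC URL for OAuth M2M (AuthMech=11, Auth_Flow=1).

    oauth_secret must be an OAuth secret generated for the service principal
    inside Databricks, NOT an Entra ID client secret.
    """
    return (
        f"jdbc:databricks://{host}:443/default;"
        f"transportMode=http;ssl=1;httpPath={http_path};"
        f"AuthMech=11;Auth_Flow=1;"
        f"OAuth2ClientId={client_id};OAuth2Secret={oauth_secret}"
    )

# Example (hypothetical values):
# build_m2m_jdbc_url("adb-123.0.azuredatabricks.net",
#                    "/sql/1.0/warehouses/abc123",
#                    "my-sp-application-id", "dose...")
```

If you swap the Entra ID client secret into `OAuth2Secret`, the driver will fail to authenticate even though the same secret works elsewhere, which matches the symptom here.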
Hi @KIRKQUINBAR, if you enable Predictive Optimization at the metastore level in Unity Catalog, it automatically applies to all Unity Catalog managed tables within that metastore, no matter which workspace is accessing them. PO runs centrally, so the...
Hi @EllaClark, yes, you can automate tagging of Databricks notebooks based on folder structure using the REST API and a script. Use the Workspace API to list notebook paths, extract folder names, and treat them as tags. If the API supports metadata up...
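A minimal sketch of that approach with the standard library: list objects in a folder via `GET /api/2.0/workspace/list`, then derive tags from each notebook's path. Host, token, and paths are placeholders; recursion into subfolders is left out for brevity.

```python
import json
import urllib.parse
import urllib.request

def tags_from_path(notebook_path: str) -> list:
    """Treat each parent folder in a workspace path as a tag,
    e.g. '/Teams/finance/etl/daily' -> ['Teams', 'finance', 'etl']."""
    parts = [p for p in notebook_path.split("/") if p]
    return parts[:-1]  # drop the notebook name itself

def list_workspace_objects(host: str, token: str, path: str) -> list:
    """Call GET /api/2.0/workspace/list for one folder (non-recursive)."""
    url = f"https://{host}/api/2.0/workspace/list?path={urllib.parse.quote(path)}"
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("objects", [])

# Example (hypothetical host/token):
# for obj in list_workspace_objects("adb-123.0.azuredatabricks.net", "dapi...", "/Teams"):
#     if obj.get("object_type") == "NOTEBOOK":
#         print(obj["path"], "->", tags_from_path(obj["path"]))
```

Since Databricks has no native notebook-tag field, you would persist the derived tags yourself, for example in a Delta table keyed by notebook path.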
Hi @Kabi, as far as I know, Databricks doesn't support directly connecting a local notebook to a Databricks kernel. However, here are practical ways to sync your local notebook with Databricks: You can use Git to version control your notebooks. Clone your repo into Dat...
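As an alternative to Git-based sync, you can push a local notebook straight into the workspace with `POST /api/2.0/workspace/import` (source sent as base64). A hedged sketch with the standard library; host, token, and paths are placeholders:

```python
import base64
import json
import urllib.request

def build_import_payload(workspace_path: str, local_source: str) -> bytes:
    """JSON body for POST /api/2.0/workspace/import: notebook source as
    base64, overwriting any existing copy at the target path."""
    return json.dumps({
        "path": workspace_path,
        "format": "SOURCE",
        "language": "PYTHON",
        "overwrite": True,
        "content": base64.b64encode(local_source.encode("utf-8")).decode("ascii"),
    }).encode("utf-8")

def push_notebook(host: str, token: str, workspace_path: str, local_file: str) -> None:
    """Upload one local .py notebook into the workspace (token is a placeholder)."""
    with open(local_file, encoding="utf-8") as f:
        body = build_import_payload(workspace_path, f.read())
    req = urllib.request.Request(
        f"https://{host}/api/2.0/workspace/import",
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req).close()

# Example (hypothetical values):
# push_notebook("adb-123.0.azuredatabricks.net", "dapi...",
#               "/Users/me@example.com/synced_nb", "local_nb.py")
```

Running this from a file watcher or a pre-commit hook gives a rough one-way sync, though Git via Databricks Repos remains the cleaner option for teams.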