06-05-2025 03:12 AM
Hi,
I have a set of notebooks which configure new catalogs, set permissions, create default schemas, attach Azure Storage accounts as external volumes, create Git Folders and set current branches, etc.
All this works just fine.
One thing I'm trying to add is automated binding of a catalog to specific workspace(s).
My SQL cells that create the catalog and set the permissions work fine.
Then I have a %pip cell which installs the databricks-sdk and restarts Python.
That works without error.
But then I have a python cell with the following:
_ = w.catalogs.update(name=newCatalogName, isolation_mode=CatalogIsolationMode.ISOLATED)
This raises the error:
PermissionDenied: Unauthorized token type db-internal to call UpdateCatalog. Config: host=https://ukwest.azuredatabricks.net, azure_tenant_id=**REDACTED**, auth_type=runtime

The odd thing is that I can use the WorkspaceClient for other operations (e.g. creating workspace items) absolutely fine without error.
Any help greatly appreciated!
For reference:
10-07-2025 05:08 AM
The error occurs because, inside an Azure Databricks notebook, the Databricks Python SDK (databricks-sdk) authenticates with a special “db-internal” token issued for user-based notebook execution. That token does not have permission to perform certain sensitive Unity Catalog (UC) actions, specifically the API calls that manage catalog isolation or workspace binding (e.g., setting the isolation mode via UpdateCatalog).
The built-in notebook credentials (the “db-internal” runtime token) have limited scope and are not permitted for these back-end UC operations, even if the calling user is an admin or the catalog owner.
This limitation applies only to certain Unity Catalog API calls such as catalog isolation; more routine catalog actions (e.g., creation, grants) can still succeed with the notebook token.
Direct REST or SDK calls for catalog binding must therefore authenticate with a personal access token (PAT) or a service principal, not the runtime token available inside notebooks.
Generate a PAT from your user settings in Databricks and use it for Python SDK authentication instead of the default notebook credentials:
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.catalog import CatalogIsolationMode

# Authenticate explicitly with a PAT instead of the notebook's runtime credentials
w = WorkspaceClient(
    host="https://<your-instance>.azuredatabricks.net",
    token="<your-personal-access-token>"
)

_ = w.catalogs.update(name=newCatalogName, isolation_mode=CatalogIsolationMode.ISOLATED)
Use this in your script, or load the token securely, for example from an environment variable or a secret scope, rather than hard-coding it (recommended for security).
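A minimal sketch of that, assuming the PAT has been stored in an environment variable or in a Databricks secret scope (the scope and key names below are placeholders, not existing objects):

import os
from databricks.sdk import WorkspaceClient

# Option 1: read the PAT from an environment variable; if DATABRICKS_HOST and
# DATABRICKS_TOKEN are both set, WorkspaceClient() with no arguments also works.
token = os.environ.get("DATABRICKS_TOKEN")

# Option 2 (inside a notebook): read the PAT from a secret scope
# token = dbutils.secrets.get(scope="uc-automation", key="admin-pat")  # placeholder scope/key

w = WorkspaceClient(
    host="https://<your-instance>.azuredatabricks.net",
    token=token,
)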
Alternatively, perform the binding or catalog update steps from a script or automation running outside Databricks notebooks (e.g., in a CI/CD pipeline or locally), where you can use true PAT or service principal authentication rather than the internal notebook token; see the sketch below.
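For the service principal route on Azure, something along these lines should work with the SDK's client-secret authentication; the environment variable names and catalog name are placeholders for your own setup:

import os
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.catalog import CatalogIsolationMode

# Azure service principal (client-secret) authentication; values come from the
# pipeline's secret store rather than being hard-coded.
w = WorkspaceClient(
    host="https://<your-instance>.azuredatabricks.net",
    azure_client_id=os.environ["ARM_CLIENT_ID"],
    azure_client_secret=os.environ["ARM_CLIENT_SECRET"],
    azure_tenant_id=os.environ["ARM_TENANT_ID"],
)

new_catalog_name = "my_new_catalog"  # placeholder for the catalog created earlier
_ = w.catalogs.update(name=new_catalog_name, isolation_mode=CatalogIsolationMode.ISOLATED)

The service principal still needs the appropriate Unity Catalog privileges (catalog ownership/MODIFY or metastore admin) for the call to succeed.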
Double-check catalog permissions in Unity Catalog: you need MODIFY and ownership on the catalog, but even with those, the notebook's SSO-backed runtime token lacks the API-level privilege enforced by the Databricks security model.
The restriction applies regardless of personal/shared cluster, because it’s about token source, not cluster type.
SSO/brokered credentials are translated into a db-internal token for notebook/interpreter execution; you cannot elevate this to a proper PAT within the same notebook session.
| API Action | Notebook "db-internal" Token | PAT/Service Principal Token |
|---|---|---|
| Create catalog, schema, grants | Works | Works |
| UpdateCatalog (isolation binding) | Fails (PermissionDenied) | Works |
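Note that setting the catalog to ISOLATED is only half of the task; binding it to specific workspaces is a separate call. Assuming a recent SDK version that exposes the workspace bindings API, and with the catalog name and workspace ID below as placeholders, a sketch looks like this:

from databricks.sdk import WorkspaceClient

# Authenticate with the PAT or service principal as shown above, not the runtime token
w = WorkspaceClient(
    host="https://<your-instance>.azuredatabricks.net",
    token="<your-personal-access-token>"
)

# Bind the isolated catalog to one or more workspaces (numeric workspace IDs)
w.workspace_bindings.update(
    name="my_new_catalog",                 # placeholder catalog name
    assign_workspaces=[1234567890123456],  # placeholder workspace ID
)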
References:
Databricks KB: “Unauthorized token type db-internal” error when calling the REST API from a notebook
Databricks Unity Catalog API authentication requirements
In conclusion, replace the notebook-internal SDK authentication with a PAT or service principal for these catalog API operations, either by running them outside the notebook or, if you must automate within a Databricks workflow, by constructing the WorkspaceClient with the proper PAT in a Python cell.