Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Setting catalog isolation mode and workspace bindings within a notebook using Python SDK

m2chrisp
New Contributor II

Hi,

I have a set of notebooks which configure new catalogs, set permissions, create default schemas, attach Azure Storage accounts as external volumes, create Git Folders and set current branches, etc.

All this works just fine.

One thing I'm trying to add is automated binding of a catalog to specific workspace(s).

I have my SQL cells creating the catalog and setting the permissions working.

Then, I have a %pip cell which installs the databricks-sdk and restarts Python.

That works without error.

But then I have a Python cell with the following:

_ = w.catalogs.update(name=newCatalogName, isolation_mode=CatalogIsolationMode.ISOLATED)

This raises the error:

PermissionDenied: Unauthorized token type db-internal to call UpdateCatalog. Config: host=https://ukwest.azuredatabricks.net, azure_tenant_id=**REDACTED**, auth_type=runtime

The odd thing is that I can use the WorkspaceClient for other operations (e.g. creating workspace items) absolutely fine without error.

Any help greatly appreciated!

For reference:

  • My user is in the admins group for the workspace
  • I have full permissions (including MODIFY) on the target catalog
  • This is using Azure Databricks with SSO user authentication.
  • The cluster is a Personal cluster.
1 ACCEPTED SOLUTION

mark_ott
Databricks Employee

The error occurs because, inside an Azure Databricks notebook, the Databricks Python SDK (databricks-sdk) authenticates with a special "db-internal" runtime token. That token does not have permission to perform certain sensitive Unity Catalog (UC) actions, specifically API calls that manage catalog isolation or workspace binding (e.g., updating a catalog's isolation mode via UpdateCatalog).

Why You’re Hitting the Error

  • The built-in notebook credentials (the "db-internal" runtime token) have limited scope and are not permitted for sensitive back-end UC operations, even if the calling user is an admin or catalog owner.

  • This limitation is specific to certain Unity Catalog API calls such as catalog isolation and binding; generic catalog actions (e.g., creating catalogs, granting permissions) can still succeed with notebook tokens.

  • Direct REST or SDK calls for catalog binding must use a personal access token (PAT) or service principal, not the runtime token used inside notebooks.

Workarounds and Solutions

1. Use a Personal Access Token (PAT) or Service Principal

Generate a PAT from your user settings in Databricks and use this for the Python SDK authentication, instead of default notebook-based credentials:

python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.catalog import CatalogIsolationMode  # needed for the ISOLATED enum

w = WorkspaceClient(
    host="https://<your-instance>.azuredatabricks.net",
    token="<your-personal-access-token>",
)
_ = w.catalogs.update(name=newCatalogName, isolation_mode=CatalogIsolationMode.ISOLATED)
  • Use this in your script, or load the token from an environment variable or secret scope rather than hard-coding it (recommended for security).
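As a minimal sketch of the environment-variable approach: the DATABRICKS_HOST and DATABRICKS_TOKEN names below follow the SDK's own default configuration convention, but the helper function itself is hypothetical, not part of the SDK.

```python
import os

def load_databricks_config():
    """Read host and PAT from environment variables instead of hard-coding them.

    DATABRICKS_HOST / DATABRICKS_TOKEN are the variable names the Databricks
    SDK itself recognizes in its default credential chain.
    """
    host = os.environ.get("DATABRICKS_HOST")
    token = os.environ.get("DATABRICKS_TOKEN")
    if not host or not token:
        raise RuntimeError(
            "Set DATABRICKS_HOST and DATABRICKS_TOKEN before running this script"
        )
    return {"host": host, "token": token}
```

The result can then be passed straight to the client, e.g. WorkspaceClient(**load_databricks_config()).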

2. Run Sensitive Tasks Outside Notebooks

  • Perform the binding or catalog-update steps from a Python script or automation running outside Databricks notebooks (e.g., in CI/CD pipelines or locally), where you can use true PAT or service principal authentication rather than the internal notebook token.
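From such a script, the binding can also be issued as a plain REST call. The sketch below only builds the request object without sending it; the /api/2.1/unity-catalog/workspace-bindings path and the assign_workspaces field are taken from the Unity Catalog REST API, but verify them against the current docs for your workspace before relying on them.

```python
import json
import urllib.request

def build_bind_request(host, token, catalog, workspace_ids):
    """Build (but do not send) a PATCH request that binds a catalog to workspaces."""
    url = f"{host}/api/2.1/unity-catalog/workspace-bindings/catalogs/{catalog}"
    body = json.dumps({"assign_workspaces": workspace_ids}).encode("utf-8")
    req = urllib.request.Request(url, data=body, method="PATCH")
    # Authenticate with a PAT or service principal token, not the db-internal token.
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Content-Type", "application/json")
    return req
```

Sending it is then just urllib.request.urlopen(req), with the usual error handling around HTTP failures.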

3. API Permissions and Catalog Ownership

  • Double-check catalog permissions in Unity Catalog: you need MODIFY (and typically ownership) on the catalog, but even with those, notebook-backed SSO tokens lack the API-level privilege enforced by the Databricks security model.

4. Cluster Type

  • The restriction applies regardless of whether the cluster is personal or shared, because it is about the token source, not the cluster type.

5. Azure SSO

  • SSO/brokered credentials are translated into a db-internal token for notebook/interpreter execution; you cannot elevate this to a proper PAT within the same notebook session.

Summary Table

API Action                          Notebook "db-internal" Token   PAT/Service Principal Token
Create catalog, schema, grants      Works                          Works
UpdateCatalog (isolation binding)   Fails (PermissionDenied)       Works

References

  • Databricks KB: "Unauthorized token type db-internal" error when calling the REST API from a notebook

  • Databricks documentation: Unity Catalog API authentication requirements

In conclusion, replace notebook-internal SDK authentication with a PAT or service principal for these catalog API operations: either run them outside the notebook, or, if you must automate within Databricks workflows, run them in a Python step that authenticates with the proper PAT.


