To resolve the `ValueError: default auth: cannot configure default credentials` error when using `dbutils` from `WorkspaceClient` in a Databricks notebook, follow these steps:
- Ensure SDK installation: Make sure the Databricks SDK for Python is installed. You can install or upgrade it with the %pip magic command in a notebook cell (-U is shorthand for --upgrade):
%pip install -U databricks-sdk
- Restart Python: After installing the SDK, restart the Python process so the installed library becomes available:
dbutils.library.restartPython()
- Use default notebook authentication: Databricks notebooks handle authentication automatically, so you can use WorkspaceClient directly, without setting any environment variables:
from databricks.sdk import WorkspaceClient

# Initialize WorkspaceClient using default notebook authentication
w = WorkspaceClient()

# List files at the root directory
d = w.dbutils.fs.ls('/')
for f in d:
    print(f.path)
By following these steps, you should be able to resolve the authentication issue and run your code successfully within a Databricks notebook.
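To build intuition for what "default auth" means here: the SDK walks a chain of credential sources (notebook context, environment variables, a config file, and so on) and raises the ValueError when none of them yields usable credentials. A rough, simplified sketch of the environment-variable part of that chain (`find_auth_source` is a hypothetical helper for illustration, not a real databricks-sdk API):

```python
# Simplified illustration of a default-credential chain; this is NOT
# the SDK's actual implementation, just the general idea.
def find_auth_source(env):
    """Return a label for the first credential source found, or None."""
    if env.get("DATABRICKS_HOST") and env.get("DATABRICKS_TOKEN"):
        return "pat"  # personal access token from the environment
    if env.get("DATABRICKS_CONFIG_FILE") or env.get("DATABRICKS_CONFIG_PROFILE"):
        return "config-file"
    return None

# No credentials anywhere -> None, which is roughly the situation in
# which "cannot configure default credentials" gets raised.
print(find_auth_source({}))  # None
print(find_auth_source({
    "DATABRICKS_HOST": "https://example.cloud.databricks.com",
    "DATABRICKS_TOKEN": "dapi-example",
}))  # pat
```

Inside a notebook, the notebook context normally satisfies this chain on its own, which is why step 3 needs no environment variables.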
Here are some troubleshooting steps that can help you find the problem:
Print the environment variables and look for DATABRICKS_HOST and/or DATABRICKS_TOKEN:
import os

for key, value in os.environ.items():
    print(f'{key}: {value}')
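Since os.environ can contain secrets, a safer variant prints only the DATABRICKS_-prefixed variables and masks token-like values (`databricks_env` is an illustrative helper, not an SDK function):

```python
import os

def databricks_env(env):
    """Collect DATABRICKS_* variables, masking anything secret-looking."""
    out = {}
    for key, value in env.items():
        if not key.startswith("DATABRICKS"):
            continue
        if "TOKEN" in key or "SECRET" in key:
            value = value[:4] + "..." if value else value  # avoid leaking credentials
        out[key] = value
    return out

for key, value in databricks_env(os.environ).items():
    print(f'{key}: {value}')
```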
You may also share the full stack trace here; I assume there is more output along with the "ValueError: default auth: cannot configure default credentials" line. Are you running this on a Serverless or Classic cluster? Which DBR release?