Hi @naga_databricks ,
Yes, there are differences in how the two types of Databricks clusters, Single-user and Shared, can access cloud resources.
Single-user clusters run under the user's identity, so they have access to that user's cloud resources, such as Google Cloud Platform (GCP) service accounts and keys. When you run a notebook on a Single-user cluster, the notebook can therefore use the user's GCP service account.
Shared clusters, on the other hand, are not tied to a single user identity; they run under a Databricks-managed service account with limited access to cloud resources. If you want to access your GCP service account from a Shared cluster, you therefore need to explicitly grant access to the service account used by the Shared cluster.
To grant access to your GCP service account, you can follow these steps:
- Create a GCP service account and grant it the necessary permissions to access your secret in Secret Manager.
- Generate a key for the service account and download it as a JSON file.
- Create a Databricks secret scope and store the service account key in it. You can use the Databricks CLI or REST API to create the secret scope and store the key.
- Grant the Databricks-managed service account the necessary permissions to access the secret scope. You can grant these permissions using the `gcloud` CLI tool or the GCP Console.
- In your notebook, use the `dbutils.secrets.get()` method to read the service account key from the secret scope and authenticate the GCP client with it.
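The last step can be sketched as follows. This is a minimal illustration, not a drop-in solution: the scope name `gcp-sa-scope`, the key name `sa-key`, and the project/secret IDs are placeholders for whatever names you created in the earlier steps, and it assumes the `google-auth` and `google-cloud-secret-manager` packages are installed on the cluster.

```python
import json

def secret_resource_name(project_id: str, secret_id: str, version: str = "latest") -> str:
    # Fully qualified resource name expected by access_secret_version().
    return f"projects/{project_id}/secrets/{secret_id}/versions/{version}"

def read_secret(sa_key_json: str, project_id: str, secret_id: str) -> str:
    # Build credentials from the service-account key JSON retrieved from the
    # Databricks secret scope, then read the secret from Secret Manager.
    from google.oauth2 import service_account    # google-auth
    from google.cloud import secretmanager       # google-cloud-secret-manager

    credentials = service_account.Credentials.from_service_account_info(
        json.loads(sa_key_json)
    )
    client = secretmanager.SecretManagerServiceClient(credentials=credentials)
    response = client.access_secret_version(
        request={"name": secret_resource_name(project_id, secret_id)}
    )
    return response.payload.data.decode("UTF-8")

# In the notebook, pull the key from the scope you created (placeholder names):
# sa_key_json = dbutils.secrets.get(scope="gcp-sa-scope", key="sa-key")
# value = read_secret(sa_key_json, "my-project", "my-secret")
```

Keeping the key in a secret scope (rather than pasting the JSON into the notebook) means it is redacted in notebook output and never lands in version control.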
By following these steps, you can grant access to your GCP service account from a Shared cluster and access your secrets without running into authentication errors.