@LeoRickli wrote:
I have a Databricks workspace that is attached to a GCP Service Account from a project named "random-production-data". I want to create a cluster (compute) in Databricks that uses a different Service Account from another project, for isolation purposes. Whenever I try to create this cluster, it is rejected as invalid. So here is the question:
Using Databricks on GCP, is it possible to create a cluster using a Service Account from another project that was not attached to the workspace when it was created?
In Databricks on GCP, clusters are typically created using the service account that is attached to the workspace. This service account is configured at the workspace level and is generally not interchangeable with service accounts from other GCP projects.
If you want to create a cluster using a different service account from another project, you'll need to ensure that the desired service account has the appropriate permissions and roles for the Databricks workspace. As of this writing, you cannot directly use a service account from a different project for a Databricks cluster unless that service account has been explicitly granted access to the Databricks workspace.
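For reference, if your setup does allow a per-cluster service account, the cluster spec is where it would be declared. Below is a minimal sketch using the Clusters REST API; the workspace URL, token, node type, Spark version, and the service account email are placeholders, and whether a cross-project account is accepted still depends on the IAM setup described above.

```python
import requests

DATABRICKS_HOST = "https://<workspace-url>.gcp.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"  # placeholder

# Hypothetical cluster spec: the cross-project service account is referenced via
# gcp_attributes.google_service_account. Node type and version are examples only.
cluster_spec = {
    "cluster_name": "isolated-sa-cluster",
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "n2-highmem-4",
    "num_workers": 2,
    "gcp_attributes": {
        # Service account from the other project (assumed to already exist and
        # to have been granted the required roles beforehand).
        "google_service_account": "sa-name@other-project.iam.gserviceaccount.com"
    },
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=cluster_spec,
)
resp.raise_for_status()
print(resp.json())  # returns the new cluster_id on success
```

If the API rejects the spec with an invalid-argument error, that usually points back to the service account not being usable from the workspace's project rather than to the request format.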
To achieve your goal of isolating environments using different service accounts, you might consider the following approaches:
1. Create a new workspace: Set up a new Databricks workspace using the desired service account. This gives you complete control over the service account used for clusters.
2. Service account permissions: If you want to stay within the same workspace, ensure that the service account from the other project has the necessary permissions to access the resources your Databricks cluster needs. You may need to adjust IAM roles and permissions.
3. Cross-project access: If resources span projects, make sure the required IAM bindings are in place for the service accounts involved; a sketch of one such grant follows this list.
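As a concrete illustration of points 2 and 3, here is a minimal sketch of granting the workspace-side identity permission to use (attach) the cross-project service account via the IAM API. The project names, account emails, and the choice of roles/iam.serviceAccountUser are assumptions for this sketch; check the Databricks on GCP documentation for the exact roles your setup requires.

```python
from googleapiclient import discovery

# Placeholders: the cross-project service account you want clusters to use, and
# the identity on the workspace side that needs to attach it (both assumptions).
TARGET_SA = (
    "projects/other-project/serviceAccounts/"
    "sa-name@other-project.iam.gserviceaccount.com"
)
DATABRICKS_IDENTITY = (
    "serviceAccount:workspace-sa@random-production-data.iam.gserviceaccount.com"
)

iam = discovery.build("iam", "v1")  # uses Application Default Credentials

# Read the current IAM policy on the cross-project service account.
policy = iam.projects().serviceAccounts().getIamPolicy(resource=TARGET_SA).execute()

# Allow the workspace-side identity to act as this service account.
# roles/iam.serviceAccountUser is the usual role for that; confirm in the docs.
bindings = policy.setdefault("bindings", [])
bindings.append({
    "role": "roles/iam.serviceAccountUser",
    "members": [DATABRICKS_IDENTITY],
})

iam.projects().serviceAccounts().setIamPolicy(
    resource=TARGET_SA,
    body={"policy": policy},
).execute()
print("Updated IAM policy on", TARGET_SA)
```

You would typically pair this with data-plane grants (for example, storage roles on the buckets the cluster reads) in the other project.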
Unfortunately, you can't simply swap in a different service account for a cluster once the workspace has been created with its own. You might want to check the latest Databricks and GCP documentation for any updates to this behavior.