Is GCP Workload Identity Federation supported for BigQuery connections in Azure Databricks?
Wednesday
I’m trying to set up a BigQuery connection in Azure Databricks (Unity Catalog / Lakehouse Federation) using GCP Workload Identity Federation (WIF) instead of a GCP service account key.
Environment:
- Azure Databricks workspace
- BigQuery query federation via Unity Catalog
- GCP Workload Identity Pool + OIDC provider configured for Azure AD
- Azure Managed Identity / App Registration issuing OIDC tokens
- GCP Service Account with roles/iam.workloadIdentityUser binding to the pool/provider
Config Example:
{
  "type": "external_account",
  "audience": "//iam.googleapis.com/projects/.../providers/...",
  "subject_token_type": "urn:ietf:params:oauth:token-type:jwt",
  "token_url": "https://sts.googleapis.com/v1/token",
  "service_account_impersonation_url": "https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/...",
  "credential_source": {
    "url": "http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=api://AzureADTokenExchange",
    "headers": { "Metadata": "True" },
    "format": { "type": "json", "subject_token_field_name": "access_token" }
  }
}

Issue:
When creating the BigQuery connection, Databricks shows the error: "Google Server Account OAuth Private Key has to be a valid JSON object from the KEYS section…"
It looks like the connector only accepts a private service account key JSON.
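For context, the config above is the standard external_account credential format used by Google's auth libraries, which is structurally different from a service account key. A minimal sketch (independent of Databricks, using a truncated copy of the config above with placeholder values) showing how the two credential types can be told apart by their fields:

```python
import json

# Fields a Workload Identity Federation (external_account) credential
# file must carry, per the standard google-auth credential config format.
WIF_REQUIRED = {"type", "audience", "subject_token_type", "token_url", "credential_source"}

def credential_kind(cred: dict) -> str:
    """Classify a Google credential JSON object (sketch, not exhaustive)."""
    if cred.get("type") == "service_account" and "private_key" in cred:
        return "service-account-key"
    if cred.get("type") == "external_account" and WIF_REQUIRED <= cred.keys():
        return "workload-identity-federation"
    return "unknown"

# Truncated copy of the config from the question (placeholders, not real values):
wif_cred = json.loads("""{
  "type": "external_account",
  "audience": "//iam.googleapis.com/projects/.../providers/...",
  "subject_token_type": "urn:ietf:params:oauth:token-type:jwt",
  "token_url": "https://sts.googleapis.com/v1/token",
  "credential_source": {"url": "http://169.254.169.254/...", "format": {"type": "json"}}
}""")

print(credential_kind(wif_cred))  # -> workload-identity-federation
```

A connector that validates for a "private_key" field would reject the external_account file even though it is valid JSON, which matches the error above.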
Question:
Is GCP Workload Identity Federation officially supported for BigQuery connections in Azure Databricks today? If so, is there a different credential format required?
yesterday
I believe only the service account key JSON is accepted, since the docs say "Google service account key json".
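For comparison, the key the connector's error points at (the one downloaded from the service account's KEYS tab in the GCP console) has this shape — field values here are placeholders, not real credentials:

```json
{
  "type": "service_account",
  "project_id": "my-project",
  "private_key_id": "…",
  "private_key": "-----BEGIN PRIVATE KEY-----\n…\n-----END PRIVATE KEY-----\n",
  "client_email": "sa-name@my-project.iam.gserviceaccount.com",
  "client_id": "…",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```

Note the "type": "service_account" and the embedded "private_key", neither of which exists in an external_account (WIF) credential file.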
My blog: https://databrickster.medium.com/