Is GCP Workload Identity Federation supported for BigQuery connections in Azure Databricks?

ciaran
New Contributor

I’m trying to set up a BigQuery connection in Azure Databricks (Unity Catalog / Lakehouse Federation) using GCP Workload Identity Federation (WIF) instead of a GCP service account key.

Environment:

  1. Azure Databricks workspace
  2. BigQuery query federation via Unity Catalog
  3. GCP Workload Identity Pool + OIDC provider configured for Azure AD
  4. Azure Managed Identity / App Registration issuing OIDC tokens
  5. GCP Service Account with roles/iam.workloadIdentityUser binding to the pool/provider

Config Example:

{
  "type": "external_account",
  "audience": "//iam.googleapis.com/projects/.../providers/...",
  "subject_token_type": "urn:ietf:params:oauth:token-type:jwt",
  "token_url": "https://sts.googleapis.com/v1/token",
  "service_account_impersonation_url": "https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/...",
  "credential_source": {
    "url": "http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=api://AzureADTokenExchange",
    "headers": { "Metadata": "True" },
    "format": { "type": "json", "subject_token_field_name": "access_token" }
  }
}
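As a sanity check outside Databricks, you can at least confirm the file above is a well-formed external-account config before blaming the connector. This is a minimal stdlib-only sketch, not an official validator; the required field names follow Google's documented `external_account` credential-configuration layout, and `validate_wif_config` is a hypothetical helper:

```python
import json

# Fields Google's external_account credential configuration requires
# (matching the WIF config shown above).
REQUIRED = {"type", "audience", "subject_token_type", "token_url", "credential_source"}

def validate_wif_config(raw: str) -> list[str]:
    """Return a list of problems found in an external_account config; empty list = looks OK."""
    cfg = json.loads(raw)
    problems = [f"missing field: {k}" for k in sorted(REQUIRED - cfg.keys())]
    if cfg.get("type") != "external_account":
        problems.append("type must be 'external_account' (a service-account key uses 'service_account')")
    src = cfg.get("credential_source", {})
    if "url" not in src and "file" not in src:
        problems.append("credential_source needs a 'url' or 'file' entry")
    return problems

# The config from the post (abbreviated) passes the shape check.
sample = {
    "type": "external_account",
    "audience": "//iam.googleapis.com/projects/.../providers/...",
    "subject_token_type": "urn:ietf:params:oauth:token-type:jwt",
    "token_url": "https://sts.googleapis.com/v1/token",
    "credential_source": {"url": "http://169.254.169.254/metadata/identity/oauth2/token"},
}
print(validate_wif_config(json.dumps(sample)))  # -> []
```

Note this only checks the file's shape; it does not perform the Azure IMDS call or STS token exchange, which require running inside the Azure environment.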

Issue:

When creating the BigQuery connection, Databricks shows the error: "Google Service Account OAuth Private Key has to be a valid JSON object from the KEYS section…"

This error suggests the connector only accepts a service-account private key JSON, not an external-account (WIF) config.
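For comparison, the format the connection dialog appears to expect is a service-account key downloaded from the KEYS tab in the GCP Console, which has this shape (values redacted; note `"type"` is `service_account`, not `external_account`):

```json
{
  "type": "service_account",
  "project_id": "...",
  "private_key_id": "...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "...",
  "client_id": "...",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```

The WIF config posted above has none of these fields (in particular no `private_key`), which would explain the validation error regardless of whether the WIF setup itself is correct.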

Question:

Is GCP Workload Identity Federation officially supported for BigQuery connections in Azure Databricks today? If so, is a different credential format required?