Hi — this is expected behavior, not a bug. Unity Catalog storage credentials in the UI are cloud-specific to your workspace deployment. Since your workspace runs on AWS, you only see AWS IAM Role and Cloudflare API Token. The GCP Service Account option only appears on GCP-deployed Databricks workspaces.
How to Access GCS from an AWS Databricks Workspace
Unity Catalog external locations don't support cross-cloud storage credentials, but you have a few options:
Option 1: GCS Connector + Service Account Key (most common)
Upload the GCS connector JAR and authenticate using a GCP service account key stored in a Databricks secret scope:
# Store your GCP SA key JSON in a secret scope first:
# databricks secrets put-secret --scope gcp --key sa-key --string-value '<json>'
service_account_key = dbutils.secrets.get("gcp", "sa-key")

# fs.gs.auth.service.account.json.keyfile expects a file *path*, not the raw
# JSON, so write the key to a local file first:
key_path = "/tmp/sa-key.json"
with open(key_path, "w") as f:
    f.write(service_account_key)

spark.conf.set("fs.gs.auth.type", "SERVICE_ACCOUNT_JSON_KEYFILE")
spark.conf.set("fs.gs.auth.service.account.json.keyfile", key_path)

df = spark.read.format("parquet").load("gs://your-bucket/path/")
You'll need the gcs-connector JAR installed on your cluster (add it via cluster Libraries tab or init script).
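Before handing the key to Spark, it can be worth sanity-checking that the secret actually contains a well-formed service-account key. A minimal sketch (the field names follow the standard GCP key JSON format; the function name is mine):

```python
import json

# Sketch: sanity-check a service-account key pulled from the secret scope
# before wiring it into Spark. Raises ValueError with a useful message
# instead of letting the GCS connector fail with an opaque auth error.
def validate_sa_key(raw: str) -> dict:
    key = json.loads(raw)
    required = {"type", "project_id", "private_key", "client_email"}
    missing = required - key.keys()
    if missing:
        raise ValueError(f"service-account key is missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError(f"expected a service_account key, got {key['type']!r}")
    return key
```

Calling this right after `dbutils.secrets.get` catches a truncated or mis-pasted key early, which is much easier to debug than a connector stack trace at read time.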
Option 2: GCS S3-Compatible API with HMAC Keys
GCS supports S3-compatible access. Create HMAC keys in GCP, then use the S3A connector:
spark.conf.set("fs.s3a.endpoint", "https://storage.googleapis.com")
spark.conf.set("fs.s3a.access.key", dbutils.secrets.get("gcp", "hmac-access-key"))
spark.conf.set("fs.s3a.secret.key", dbutils.secrets.get("gcp", "hmac-secret-key"))
# Path-style requests are typically required against the GCS XML API:
spark.conf.set("fs.s3a.path.style.access", "true")

df = spark.read.format("parquet").load("s3a://your-gcs-bucket/path/")
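If you use this pattern across several notebooks, it helps to gather the settings in one helper so the endpoint and flags stay consistent. A sketch (the secret scope/key names match the example above and are assumptions):

```python
# Sketch: collect the S3A-on-GCS settings in one dict so they can be
# applied in a single loop. Path-style access is generally needed when
# pointing the S3A connector at the GCS XML API.
def gcs_s3a_conf(access_key: str, secret_key: str) -> dict:
    return {
        "fs.s3a.endpoint": "https://storage.googleapis.com",
        "fs.s3a.access.key": access_key,
        "fs.s3a.secret.key": secret_key,
        "fs.s3a.path.style.access": "true",
    }

# On the cluster:
# for k, v in gcs_s3a_conf(dbutils.secrets.get("gcp", "hmac-access-key"),
#                          dbutils.secrets.get("gcp", "hmac-secret-key")).items():
#     spark.conf.set(k, v)
```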
Option 3: Delta Sharing (if data is on a GCP Databricks workspace)
If the GCS data is managed by another Databricks workspace on GCP, the cleanest approach is Delta Sharing — share the tables from the GCP workspace and consume them in your AWS workspace. No cross-cloud credentials needed.
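For Databricks-to-Databricks sharing, the consumer side just reads the shared catalog with `spark.read.table("<catalog>.<schema>.<table>")`. If the consumer were a non-Databricks client instead, the open delta-sharing connector addresses a table as `<profile-file>#<share>.<schema>.<table>`. A small sketch (all names below are hypothetical):

```python
# Sketch: build the table coordinate used by the open delta-sharing client,
# which follows the "<profile-file>#<share>.<schema>.<table>" format.
def share_table_url(profile_path: str, share: str, schema: str, table: str) -> str:
    return f"{profile_path}#{share}.{schema}.{table}"

url = share_table_url("/dbfs/FileStore/gcp.share", "gcp_share", "sales", "orders")
# Then, with the delta-sharing package installed on the client:
# import delta_sharing
# df = delta_sharing.load_as_pandas(url)
```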
Summary
| Approach | Unity Catalog Governed | Needs JAR | Complexity |
|---|---|---|---|
| GCS Connector + SA Key | No | Yes | Medium |
| HMAC / S3-Compatible | No | No | Low |
| Delta Sharing | Yes | No | Low |
Note: Options 1 and 2 bypass Unity Catalog governance (no external locations / storage credentials). If governance is a requirement, Delta Sharing is the recommended path.
Anuj Lathi
Solutions Engineer @ Databricks