Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

[Unity Catalog] Missing Credential Type When Connecting GCS to Databricks in an AWS Environment

Junhyeon-Jeon_1
Visitor

Hi, I'm using Databricks in an AWS environment and
I'm trying to link data from GCP GCS to Unity Catalog.

[Official document]
I tried to set it up by following the official Databricks guide below.

▶ Create service credentials guide

[Problem situation]
The document says to create a credential under Catalog → External Data → Credentials by selecting the GCP Service Account type, but in my environment:

Menu mismatch: when selecting a Credential Type, only AWS IAM Role and Cloudflare API Token appear; the GCP Service Account option described in the document is not visible at all. (Please see the attached image.)

์Šคํฌ๋ฆฐ์ƒท 2026-04-09 100042.png

Environment difference: our team's Databricks workspace currently runs on AWS.

[Question]

  • Is it expected that AWS-based workspaces restrict creating GCP credentials via the UI?
  • If I need AWS-to-GCP interworking like this, do I have to create the credentials via the CLI or API instead of the UI, or is there a way to enable other cloud credential types in the metastore settings?
  • Alternatively, if there is a best practice for connecting to GCS from an AWS environment, please share it.

[Environmental Information]

  • Runtime: 16.3
  • Unity Catalog: Enabled
  • Cloud Environment: AWS
1 REPLY

anuj_lathi
Databricks Employee

Hi, this is expected behavior, not a bug. Unity Catalog storage credentials in the UI are cloud-specific to your workspace deployment. Since your workspace runs on AWS, you only see AWS IAM Role and Cloudflare API Token. The GCP Service Account option only appears on GCP-deployed Databricks workspaces.

How to Access GCS from an AWS Databricks Workspace

Unity Catalog external locations don't support cross-cloud storage credentials, but you have a few options:

Option 1: GCS Connector + Service Account Key (most common)

Upload the GCS connector JAR and authenticate using a GCP service account key stored in a Databricks secret scope:

# Store your GCP SA key JSON in a secret scope first:
# databricks secrets put-secret --scope gcp --key sa-key --string-value '<json>'

service_account_key = dbutils.secrets.get("gcp", "sa-key")

# The connector expects a *path* to the key file, not the JSON itself,
# so write the key to a local file on the driver first:
with open("/tmp/sa-key.json", "w") as f:
    f.write(service_account_key)

spark.conf.set("fs.gs.auth.type", "SERVICE_ACCOUNT_JSON_KEYFILE")
spark.conf.set("fs.gs.auth.service.account.json.keyfile", "/tmp/sa-key.json")

df = spark.read.format("parquet").load("gs://your-bucket/path/")

 

You'll need the gcs-connector JAR installed on your cluster (add it via cluster Libraries tab or init script).
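Before wiring the key into Spark, it can save debugging time to check that the secret actually holds a complete service-account JSON. A minimal sketch; the `validate_sa_key` helper is illustrative, not a Databricks or GCP API:

```python
import json

# Fields the GCS connector needs from a service-account key
REQUIRED_FIELDS = ("type", "project_id", "private_key", "client_email")

def validate_sa_key(raw_json: str) -> dict:
    """Parse a GCP service-account key and fail fast with a clear
    message if any field the connector relies on is missing."""
    key = json.loads(raw_json)
    missing = [f for f in REQUIRED_FIELDS if f not in key]
    if missing:
        raise ValueError(f"service-account key missing fields: {missing}")
    if key["type"] != "service_account":
        raise ValueError(f"expected type 'service_account', got {key['type']!r}")
    return key

# In a notebook you would run it against the secret before configuring Spark:
# key = validate_sa_key(dbutils.secrets.get("gcp", "sa-key"))
```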

Option 2: GCS S3-Compatible API with HMAC Keys

GCS supports S3-compatible access. Create HMAC keys in GCP, then use the S3A connector:

spark.conf.set("fs.s3a.endpoint", "https://storage.googleapis.com")
spark.conf.set("fs.s3a.access.key", dbutils.secrets.get("gcp", "hmac-access-key"))
spark.conf.set("fs.s3a.secret.key", dbutils.secrets.get("gcp", "hmac-secret-key"))

df = spark.read.format("parquet").load("s3a://your-gcs-bucket/path/")
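One way to keep these S3A settings together is a small helper that returns them as a dict and applies them in one loop. A sketch under assumptions: the `gcs_s3a_confs` name is made up, and the `fs.s3a.path.style.access` flag is an assumption that may or may not be required for your buckets:

```python
GCS_S3_ENDPOINT = "https://storage.googleapis.com"

def gcs_s3a_confs(access_key: str, secret_key: str) -> dict:
    """Hadoop S3A settings that point the connector at GCS's
    S3-compatible XML API instead of AWS S3."""
    return {
        "fs.s3a.endpoint": GCS_S3_ENDPOINT,
        "fs.s3a.access.key": access_key,
        "fs.s3a.secret.key": secret_key,
        # Assumption: path-style requests; drop if your setup
        # works with virtual-hosted style.
        "fs.s3a.path.style.access": "true",
    }

# In a notebook:
# for k, v in gcs_s3a_confs(ak, sk).items():
#     spark.conf.set(k, v)
```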

 

Option 3: Delta Sharing (if data is on a GCP Databricks workspace)

If the GCS data is managed by another Databricks workspace on GCP, the cleanest approach is Delta Sharing โ€” share the tables from the GCP workspace and consume them in your AWS workspace. No cross-cloud credentials needed.
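On the consuming side, the open delta-sharing Python client reads a shared table via a URL of the form `<profile-file>#<share>.<schema>.<table>`. A minimal sketch; the helper name and the share/schema/table values are placeholders:

```python
def sharing_table_url(profile_path: str, share: str, schema: str, table: str) -> str:
    """Build the '<profile>#<share>.<schema>.<table>' table URL
    that the delta-sharing client expects."""
    return f"{profile_path}#{share}.{schema}.{table}"

# With the delta-sharing package installed (pip install delta-sharing):
# import delta_sharing
# url = sharing_table_url("/dbfs/FileStore/config.share", "my_share", "default", "events")
# df = delta_sharing.load_as_pandas(url)
```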

Summary

 

Approach               | Unity Catalog Governed | Needs JAR | Complexity
-----------------------|------------------------|-----------|-----------
GCS Connector + SA Key | No                     | Yes       | Medium
HMAC / S3-Compatible   | No                     | No        | Low
Delta Sharing          | Yes                    | No        | Low

Note: Options 1 and 2 bypass Unity Catalog governance (no external locations / storage credentials). If governance is a requirement, Delta Sharing is the recommended path.


Anuj Lathi
Solutions Engineer @ Databricks