from Google Cloud Storage
03-06-2025 02:39 AM
Hi everyone,
I'm new to Databricks and am trying to connect my Google Cloud Storage bucket to my Databricks workspace. I have a 43GB CSV file stored in a GCP bucket that I want to work with. Here's what I've done so far:
Bucket Setup:
- I created a GCP bucket (in the west6 region) where my CSV file is stored.
Databricks Configuration:
- I have a Databricks workspace (in the west2 region).
- I created a storage credential in Unity Catalog using a GCP Service Account, and I noted down the service account email.
IAM Roles:
- In the Google Cloud Console, I granted the service account the Storage Legacy Bucket Reader and Storage Object Admin roles on my bucket.
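For reference, the same role grants can be made from the command line. This is just a sketch of what I did in the Console; the service-account email below is a placeholder for the one Unity Catalog generated for my storage credential:

```shell
# Grant the Unity Catalog service account the two roles on the bucket.
# <my-bucket-name> and the serviceAccount email are placeholders.
gcloud storage buckets add-iam-policy-binding gs://<my-bucket-name> \
  --member="serviceAccount:uc-credential@my-project.iam.gserviceaccount.com" \
  --role="roles/storage.legacyBucketReader"

gcloud storage buckets add-iam-policy-binding gs://<my-bucket-name> \
  --member="serviceAccount:uc-credential@my-project.iam.gserviceaccount.com" \
  --role="roles/storage.objectAdmin"
```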
External Location:
- I attempted to create an external location in Databricks, pointing to gs://<my-bucket-name>/, using the storage credential I created.
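In case it helps diagnose the problem, this is roughly the equivalent of what I did in the UI, expressed with the Databricks CLI (assuming a recent CLI version authenticated to the workspace; `my_gcs_location` and `gcs_credential` are placeholder names for the external location and the storage credential):

```shell
# Create an external location pointing at the bucket, backed by the
# Unity Catalog storage credential. Names here are placeholders.
databricks external-locations create my_gcs_location \
  gs://<my-bucket-name>/ gcs_credential
```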
Despite following these steps, I'm unable to see or access my CSV file from Databricks. I'm not sure if the region difference (bucket in west6 vs. workspace in west2) or something else is causing the issue.
Has anyone experienced a similar problem or can provide guidance on troubleshooting this connection? Any help would be greatly appreciated!
Thanks in advance!

