Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

from Google Cloud Storage

refah_1
New Contributor

Hi everyone,

I'm new to Databricks and am trying to connect my Google Cloud Storage bucket to my Databricks workspace. I have a 43GB CSV file stored in a GCP bucket that I want to work with. Here's what I've done so far:

  1. Bucket Setup:

    • I created a GCP bucket (in the west6 region) where my CSV file is stored.
  2. Databricks Configuration:

    • I have a Databricks workspace (in the west2 region).
    • I created a storage credential in Unity Catalog using a GCP Service Account, and I noted down the service account email.
  3. IAM Roles:

    • In the Google Cloud Console, I granted the service account the Storage Legacy Bucket Reader and Storage Object Admin roles on my bucket.
  4. External Location:

    • I attempted to create an external location in Databricks, pointing to gs://<my-bucket-name>/, using the storage credential I created.
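For reference, step 4 can be sketched as Unity Catalog SQL. The location, credential, and bucket names below are placeholders, not my actual values:

```sql
-- Hypothetical names; substitute your own bucket, credential, and location names.
-- Run in a Databricks SQL editor or notebook with Unity Catalog enabled.
CREATE EXTERNAL LOCATION IF NOT EXISTS my_gcs_location
URL 'gs://<my-bucket-name>/'
WITH (STORAGE CREDENTIAL my_gcp_credential)
COMMENT 'External location for the 43GB CSV';
```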

Despite following these steps, I'm unable to see or access my CSV file from Databricks. I'm not sure whether the region difference (bucket in west6 vs. workspace in west2) or something else is causing the issue.
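For context, the read I'm attempting looks roughly like this (the file path is a placeholder; this assumes a Databricks notebook where `spark` is already defined):

```python
# Hypothetical path; requires a cluster with Unity Catalog access to the external location.
df = (spark.read
      .option("header", "true")
      .csv("gs://<my-bucket-name>/<my-file>.csv"))
df.show(5)  # this is where access fails for me
```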

Has anyone experienced a similar problem or can provide guidance on troubleshooting this connection? Any help would be greatly appreciated!

Thanks in advance!

0 REPLIES
