Unable to access external location on GCP from Databricks on GCP

shiva12494
New Contributor II

I am trying to add a GCP storage bucket as an external location and read and write Unity Catalog-enabled Delta tables to that location from Databricks on GCP. I keep getting an error saying that the Databricks instance service principal doesn't have access to the storage location. We have granted access to the service principal and also to the buckets, but it still doesn't work. We didn't face this kind of issue on AWS. Any guidance is appreciated.

[Attachment: Screen Shot 2023-04-14 at 3.28.04 PM]
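
For context, this is roughly the flow we are running from a notebook. The external location, storage credential, bucket, and table names below are placeholders, not our real values:

  # Register the bucket as a Unity Catalog external location (names are placeholders).
  spark.sql("""
      CREATE EXTERNAL LOCATION IF NOT EXISTS my_gcs_location
      URL 'gs://my-external-bucket/landing'
      WITH (STORAGE CREDENTIAL my_gcp_credential)
  """)

  # Reading/writing Delta against that location is where the "service principal
  # doesn't have access" error shows up.
  df = spark.read.format("delta").load("gs://my-external-bucket/landing/my_table")
  df.write.format("delta").mode("append").save("gs://my-external-bucket/landing/my_table")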

5 REPLIES

Anonymous
Not applicable

@shiva charan velichala​:

If you have already granted the Databricks instance service principal access to the GCP Storage buckets, there are a few additional things you can try to resolve this issue:

  1. Verify the access control settings: Make sure that you have granted the Databricks instance service principal the necessary IAM roles and permissions on the GCP Storage buckets. You can check the access control settings in the IAM & Admin section of the GCP console (a quick permission check you can run yourself is sketched right after this list).
  2. Check the network settings: Ensure that the network settings for the GCP Storage buckets are configured correctly. You may need to allow inbound traffic from the Databricks cluster or VPC network to the GCP Storage buckets.
  3. Verify the credentials: Double-check that you are using the correct credentials to authenticate with the GCP Storage buckets. If you are using a service account key file, make sure that the key file is valid and has the necessary permissions.
  4. Check the region settings: Ensure that the GCP Storage buckets and the Databricks cluster are located in the same region. If they are in different regions, you may need to configure cross-region access.
  5. Try a different approach: If the above steps do not work, you may want to try a different approach for accessing the GCP Storage buckets, such as using the gsutil command-line tool or the GCP Storage API directly.
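
As a concrete way of doing the check in step 1, here is a rough sketch using the google-cloud-storage Python client. The bucket name and key-file path are placeholders, and it assumes you can authenticate as (or hold a key for) the service account that Databricks is using:

  # Check which of the permissions Unity Catalog needs are actually granted
  # on the bucket (bucket name and key path are placeholders).
  from google.cloud import storage

  client = storage.Client.from_service_account_json("databricks-sa-key.json")
  bucket = client.bucket("my-external-bucket")

  needed = [
      "storage.objects.get",
      "storage.objects.list",
      "storage.objects.create",
      "storage.objects.delete",
  ]
  granted = bucket.test_iam_permissions(needed)
  print("granted:", granted)
  print("missing:", sorted(set(needed) - set(granted)))

If anything shows up under "missing", that is the gap to close in the bucket's IAM policy.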

Anonymous
Not applicable

Hi @shiva charan velichala​ 

Hope everything is going great.

Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we can help you. 

Cheers!

karthik_p
Esteemed Contributor

Adding to @Suteja Kanuri​'s reply: @shiva charan velichala​, can you please recheck whether reader access, as shown below, has been granted on your externally created bucket?

On the Permissions tab, click + Grant access and assign the service account the following roles (a programmatic equivalent is sketched after this list):

  • Storage Legacy Bucket Reader
  • Storage Object Admin
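
If you would rather do this from code than the console, a rough equivalent with the google-cloud-storage Python client looks like this; the bucket name and service-account email are placeholders for your own values:

  # Grant the two roles above to the service account at the bucket level
  # (bucket name and service-account email are placeholders).
  from google.cloud import storage

  client = storage.Client()
  bucket = client.bucket("my-external-bucket")

  member = "serviceAccount:databricks-sa@my-project.iam.gserviceaccount.com"
  policy = bucket.get_iam_policy(requested_policy_version=3)
  for role in ("roles/storage.legacyBucketReader", "roles/storage.objectAdmin"):
      policy.bindings.append({"role": role, "members": {member}})
  bucket.set_iam_policy(policy)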

Linda
New Contributor II

Have you found a solution?

shiva12494
New Contributor II

No, I tried approaches 1 to 4 and none of them worked. We want to use dbutils and not gsutil, as switching would require a lot of code changes. Currently dbutils is not working.
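
For context, this is the kind of call that fails today (the path is a placeholder):

  # Listing the external location with dbutils fails with the
  # "service principal doesn't have access" error (path is a placeholder).
  dbutils.fs.ls("gs://my-external-bucket/landing/")

  # The equivalent Spark read against the same location fails the same way.
  df = spark.read.format("delta").load("gs://my-external-bucket/landing/my_table")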
