Shared Access mode cluster FAILS to write data to BigQuery
12-13-2024 09:46 AM - edited 12-13-2024 09:52 AM
We are migrating our old infrastructure to Unity Catalog.
We have some pipelines which write to BigQuery tables.
To enable Unity Catalog at the cluster level we have two access-mode options (Single user and Shared).
Unfortunately, when using a cluster in Shared access mode we could not write data to BigQuery, even though we followed the instructions in https://docs.databricks.com/en/connect/external-systems/bigquery.html.
We CAN read data but we CANNOT write any. The error is:
org.apache.spark.SparkException: java.io.IOException: Error getting access token from metadata server at: Caused by: org.apache.spark.SparkException: shaded.databricks.com.google.api.client.http.HttpResponseException: 404 Not Found
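For reference, this is roughly the write path that fails. A minimal sketch of a BigQuery write with the Spark connector is below; the table and staging-bucket names are placeholders, and the helper function is purely illustrative (the connector itself only needs the `table` and `temporaryGcsBucket` options):

```python
# Hypothetical sketch of an indirect BigQuery write from Spark.
# "my-project.my_dataset.my_table" and "my-staging-bucket" are placeholders.

def bigquery_write_options(table: str, gcs_bucket: str) -> dict:
    """Build the options the spark-bigquery connector expects for a write."""
    return {
        "table": table,                    # target as project.dataset.table
        "temporaryGcsBucket": gcs_bucket,  # GCS bucket used to stage the load
    }

# On a cluster with the BigQuery connector installed:
# df.write.format("bigquery") \
#     .options(**bigquery_write_options("my-project.my_dataset.my_table",
#                                       "my-staging-bucket")) \
#     .mode("append") \
#     .save()
```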
Any help is appreciated. 🙂
- Labels: Unity Catalog
12-13-2024 12:35 PM
Based on the context provided, here are some potential causes and solutions:
- Metadata server URL: ensure that http://169.254.169.254/computeMetadata/v1/instance/service-accounts/default/token is accessible from your environment. This URL is used to fetch the access token for the default service account.
- Service account permissions: verify that the service account associated with your Databricks cluster has the necessary permissions on the target resources, e.g. a role such as BigQuery Data Editor on the target dataset, plus write access to the temporary GCS bucket the connector uses for staging.
- Service account configuration: check that the service account in use is correctly configured and has not been deleted or recreated without updating the corresponding credentials in Databricks. If the service account was recreated, you may need to refresh or update the stored credentials.
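One way to rule out the metadata-server path entirely is to pass explicit credentials to the connector. The spark-bigquery connector accepts a `credentials` option containing a base64-encoded service account key JSON; a small sketch (the key contents here are a placeholder, not a real key):

```python
import base64

def encode_service_account_key(key_json: str) -> str:
    """Base64-encode a service account key JSON for the connector's
    `credentials` option, so it does not fall back to the metadata server."""
    return base64.b64encode(key_json.encode("utf-8")).decode("utf-8")

# creds_b64 = encode_service_account_key(key_json)  # key_json read from a secret
# df.write.format("bigquery") \
#     .option("credentials", creds_b64) \
#     .option("table", "my-project.my_dataset.my_table") \
#     .option("temporaryGcsBucket", "my-staging-bucket") \
#     .save()
```

Storing the key in a Databricks secret scope rather than in notebook code is the usual practice.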
12-13-2024 01:01 PM - edited 12-13-2024 01:04 PM
Is there any related documentation I could read to check these possible solutions?
What I can't understand is why I am able to read but cannot write, using the same cluster and the same configuration (credentials).
When I change the cluster from Shared to Single user with the same credentials, everything works perfectly. It seems to be a Shared access mode problem, but I can't figure out what's going wrong.
Thank you for the reply.
12-15-2024 07:01 PM
If you run SHOW GRANTS on this cluster, what does it show? If you have SELECT on ANY FILE, then you probably just need to grant MODIFY on ANY FILE as described here: https://docs.databricks.com/en/data-governance/table-acls/any-file.html
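A sketch of running that check from a notebook. The principal `user@example.com` is a placeholder, and the exact SHOW GRANTS syntax may differ by DBR version, so treat this as an outline rather than copy-paste SQL:

```python
# Hypothetical sketch: build the legacy table-ACL statements from the
# ANY FILE docs for a given principal (placeholder: user@example.com).

def grant_statements(principal: str) -> list:
    """Return the check-then-grant SQL statements for ANY FILE privileges."""
    return [
        f"SHOW GRANTS `{principal}` ON ANY FILE",
        f"GRANT MODIFY ON ANY FILE TO `{principal}`",
    ]

# In a notebook on the Shared cluster:
# for stmt in grant_statements("user@example.com"):
#     spark.sql(stmt)
```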

