We are trying to migrate our old infrastructure to Unity Catalog.
We have some pipelines which write to BigQuery tables.
To enable Unity Catalog at the cluster level we have two access mode options: Single user and Shared.
Unfortunately, when we tried to write data to BigQuery from a cluster in Shared access mode, we couldn't get it to work, even though we followed the instructions described in https://docs.databricks.com/en/connect/external-systems/bigquery.html.
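For reference, here is a minimal sketch of roughly what the pipeline does (the project, dataset, table, and bucket names are placeholders, and the service account credentials are configured at the cluster level as per the linked docs):

```python
# Rough sketch of the pipeline logic (placeholder names, not our real tables).

# Read from BigQuery -- this works fine on the Shared access mode cluster.
df = (
    spark.read.format("bigquery")
    .option("table", "my-gcp-project.my_dataset.source_table")
    .load()
)

# Write back to BigQuery -- this is the step that fails with the error below.
(
    df.write.format("bigquery")
    .mode("append")
    .option("table", "my-gcp-project.my_dataset.target_table")
    .option("temporaryGcsBucket", "my-temp-bucket")  # indirect write via a GCS staging bucket
    .save()
)
```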
We CAN read data, but we CANNOT write anything. The error is:
org.apache.spark.SparkException: java.io.IOException: Error getting access token from metadata server at:
Caused by: org.apache.spark.SparkException: shaded.databricks.com.google.api.client.http.HttpResponseException: 404 Not Found
Any help is appreciated. 🙂