I am following this guide on writing data to a BigQuery table.
Right now, I get an error when I try to write data using a Databricks Secret instead of pointing the GOOGLE_APPLICATION_CREDENTIALS environment variable at the JSON credential file:
java.io.IOException: Error getting access token from metadata server at: http://169.x.x.x/computeMetadata/v1/instance/service-accounts/default/token
What is strange is that I can both read and write the data when I use the GOOGLE_APPLICATION_CREDENTIALS environment variable, and I can also read the data using the Databricks Secret. Only the write with the Databricks Secret fails. Judging by the error, something in the write path is not seeing the credentials and is falling back to the instance metadata server, but I can't tell what.
Here is my code to read the Databricks Secret:
import base64

# The spark-bigquery-connector expects the service-account JSON key as a
# base64-encoded string in the "credentials" property.
cred = dbutils.secrets.get(scope="bigquery-scope", key="secret-name").encode("ascii")
cred = base64.b64encode(cred).decode("ascii")
spark.conf.set("credentials", cred)
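To rule out a malformed secret, I also ran a quick sanity check that the stored value round-trips to valid service-account JSON (just a debugging snippet; the field names are the standard service-account key fields):

import base64
import json

# Debugging check (assumes the secret stores the raw service-account JSON):
# decode the base64 string back and confirm it parses with the expected fields.
sa = json.loads(base64.b64decode(cred).decode("ascii"))
print(sa["type"], sa["client_email"])  # expect: service_account <sa-email>

Since reads succeed with the same cred value, the secret itself looks fine.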
Below is my code to read/write the data:
# Read data
df = spark.read.format("bigquery") \
    .option("parentProject", <parent-project-id>) \
    .option("viewsEnabled", "true") \
    .option("table", <table-name>) \
    .load()
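If it matters, my understanding from the spark-bigquery-connector docs is that the credentials can also be supplied per operation rather than through spark.conf; here is a sketch of that variant (same base64 string as above, placeholders in angle brackets):

# Variant: pass the base64 credentials explicitly on each read/write via
# the connector's "credentials" option instead of the global spark.conf.
df = spark.read.format("bigquery") \
    .option("credentials", cred) \
    .option("parentProject", <parent-project-id>) \
    .option("viewsEnabled", "true") \
    .option("table", <table-name>) \
    .load()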
# Write data
df.write.format("bigquery") \
    .mode("overwrite") \
    .option("temporaryGcsBucket", <bucket-name>) \
    .option("table", <table-name>) \
    .option("parentProject", <parent-project-id>) \
    .save()
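Since only the write fails, my suspicion is that the staging step through temporaryGcsBucket goes through the GCS Hadoop connector, which authenticates separately from the BigQuery client, never sees the "credentials" conf, and so falls back to the metadata server. As a workaround I tried wiring the same service account into the Hadoop configuration; the property names below are the GCS Hadoop connector's service-account settings, and this is only a sketch of what I attempted, not a confirmed fix:

import base64
import json

# Assumption: the write stages data in temporaryGcsBucket through the GCS
# Hadoop connector, which needs its own service-account configuration.
sa = json.loads(base64.b64decode(cred).decode("ascii"))
hconf = spark._jsc.hadoopConfiguration()
hconf.set("google.cloud.auth.service.account.enable", "true")
hconf.set("fs.gs.auth.service.account.email", sa["client_email"])
hconf.set("fs.gs.auth.service.account.private.key.id", sa["private_key_id"])
hconf.set("fs.gs.auth.service.account.private.key", sa["private_key"])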
Am I missing any configuration for writing data to BigQuery with a Databricks Secret?