I am creating a DataFrame by reading a table that lives in an Azure-backed Unity Catalog, and I need to write that DataFrame out as Parquet to a GCS bucket. I have configured the Spark cluster config with the values from the GCP service account JSON key. I also tried uploading the keyfile to DBFS and referencing it through .config().
On running:

df1.write.format("parquet").save("gs://dev-XXXX-analyt-XXXXXXXX")

I get this error:

Error getting access token from metadata server at: http://169.254.169.254/computeMetadata/v1/instance/service-accounts/default/token

What could be the reason, and how can I resolve it?
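For reference, my cluster config attempt roughly follows the GCS connector's service-account Hadoop properties. A minimal sketch of how I build those properties from the JSON key (the keyfile path and helper function are my own; the key names are the ones documented for the GCS connector on Databricks):

```python
import json

def gcs_hadoop_conf(keyfile_path):
    """Map fields from a GCP service-account JSON keyfile to the
    Hadoop properties the GCS connector reads. On Databricks these
    are set at the cluster level with the "spark.hadoop." prefix."""
    with open(keyfile_path) as f:
        sa = json.load(f)
    return {
        "spark.hadoop.google.cloud.auth.service.account.enable": "true",
        "spark.hadoop.fs.gs.auth.service.account.email": sa["client_email"],
        "spark.hadoop.fs.gs.auth.service.account.private.key.id": sa["private_key_id"],
        "spark.hadoop.fs.gs.auth.service.account.private.key": sa["private_key"],
        "spark.hadoop.fs.gs.project.id": sa["project_id"],
    }
```

My understanding is that when the connector cannot see these properties, it falls back to the GCE metadata server at 169.254.169.254, which does not exist on Azure-hosted workers, and that would explain the error above. But I'm not sure whether my config is being picked up at all.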