01-28-2025 10:56 PM
Hey everyone,
We're trying to access secrets stored in GCP Secret Manager using its Python package from Databricks on GCP. However, we can only reach the Secret Manager when using "No Isolation Shared" clusters, which is not an option for us. Currently, we haven't found any alternative solutions.
Has anyone encountered this issue or found a workaround?
The error message indicates that Googleโs metadata server is unreachable.
Thanks in advance!
Best regards
01-29-2025 05:39 AM
Hello @yumnus,
Could you please share the full error trace? What package are you installing?
One suggestion: instead of relying on the metadata server, you can use a service account key file to authenticate with GCP Secret Manager. You can store the service account key as a Databricks secret and then use it in your code to authenticate. Here's a general approach:
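A minimal sketch of that approach, assuming a service account key JSON has already been stored as a Databricks secret (the scope, key, project, and secret names below are placeholders, not real values):

```python
import json


def secret_version_name(project_id: str, secret_id: str, version: str = "latest") -> str:
    """Build the fully qualified Secret Manager resource name."""
    return f"projects/{project_id}/secrets/{secret_id}/versions/{version}"


def get_gcp_secret(scope: str, key: str, project_id: str, secret_id: str) -> str:
    """Fetch a secret value without touching the GCE metadata server.

    Assumes this runs in a Databricks notebook (`dbutils` available) with
    google-cloud-secret-manager installed on the cluster.
    """
    from google.oauth2 import service_account
    from google.cloud import secretmanager

    # The service account key JSON, stored as a Databricks secret.
    key_json = dbutils.secrets.get(scope=scope, key=key)
    creds = service_account.Credentials.from_service_account_info(json.loads(key_json))

    # Pass the credentials explicitly so google-auth never falls back
    # to the (unreachable) metadata server.
    client = secretmanager.SecretManagerServiceClient(credentials=creds)
    resp = client.access_secret_version(
        request={"name": secret_version_name(project_id, secret_id)}
    )
    return resp.payload.data.decode("utf-8")
```

Calling it would look like `get_gcp_secret("gcp", "sa-key-json", "my-project", "db-password")`, with those names replaced by your own secret scope and project.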
01-29-2025 06:28 AM
Hi Alberto,
When we use No Isolation Shared clusters it works; otherwise we get these error messages:
WARNING:google.auth.compute_engine._metadata:Compute Engine Metadata server unavailable on attempt 1 of 3. Reason: [Errno 111] Connection refused
WARNING:google.auth._default:No project ID could be determined. Consider running `gcloud config set project` or setting the GOOGLE_CLOUD_PROJECT environment variable
WARNING:google.auth.compute_engine._metadata:Compute Engine Metadata server unavailable on attempt 1 of 5. Reason: HTTPConnectionPool(host='metadata.google.internal', port=80): Max retries exceeded with url: /computeMetadata/v1/instance/service-accounts/default/?recursive=true (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7931121fe650>: Failed to establish a new connection: [Errno 111] Connection refused'))
google.auth.exceptions.TransportError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true from the Google Compute Engine metadata service. Compute Engine Metadata server unavailable
Thank you!
01-29-2025 06:41 AM
Also, the package is google-cloud-secret-manager 2.22.0.
2 weeks ago
The problem is that we shouldn't be using JSON keys at all if we are running Databricks on GCP. With "No Isolation Shared", the cluster is able to query the GCE metadata service and get the credentials of the service account attached to the instance. There is even a setting for selecting a service account when creating a cluster, so it should be supported.
07-22-2025 12:42 AM
@yumnus did you ever resolve this error when the cluster is not "No Isolation Shared"?
2 weeks ago - last edited 2 weeks ago
This is a huge issue. We are seeing the same thing. Is Google auth broken for Databricks on GCP? Only with No Isolation Shared enabled is the cluster able to access the metadata service and get credentials.
Why is the metadata service not reachable? I would be shocked if Databricks for GCP doesn't support basic auth integration. I should not have to generate an insecure JSON key when this is all running inside of GCP.
RefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true from the Google Compute Engine metadata service. Compute Engine Metadata server unavailable due to HTTPConnectionPool(host='metadata.google.internal', port=80): Max retries exceeded with url: /computeMetadata/v1/instance/service-accounts/default/?recursive=true (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7b2cd5708200>: Failed to establish a new connection: [Errno 111] Connection refused'))
I can also confirm this with curl in a notebook.
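For anyone wanting to reproduce the check, the probe google-auth performs can be run by hand from a `%sh` notebook cell (the URL is taken from the error trace above; the `email` endpoint is just a convenient small response):

```shell
# Probe the GCE metadata server the same way google-auth does.
# On a working cluster this prints the attached service account's email;
# on an affected cluster it falls through to the unreachable message,
# matching the "Connection refused" in the Python traceback.
RESULT=$(curl -fsS --max-time 5 -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/email" \
  2>/dev/null || echo "metadata server unreachable")
echo "$RESULT"
```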
2 weeks ago
Have you tried using UC single user (or group clusters)?