Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.
fails to install backend on runtimes > 10.4

New Contributor III

We run Databricks on GCP.  We store our private Python packages in the Google Artifact Registry.  When we need to install the private packages, we use a global init script to install `keyring` and the Artifact Registry keyring backend package.  Then we run `pip install --extra-index-url <our index URL> our_packages`, either from cluster init scripts or from a cell in a notebook.  The clusters have Google Service Accounts that have access to the Artifact Registry.  When we do this with a Databricks runtime of 10.4 ML or lower, everything works as expected, and our dependencies are installed correctly.
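For anyone comparing setups, a minimal global init script along these lines might look like the sketch below. All `<...>` values are placeholders, including the backend package name (which is elided in the original post), and the `/databricks/python/bin/pip` path assumes the cluster's default Python environment:

```shell
#!/bin/bash
# Sketch of a global cluster init script; all <...> values are placeholders.
set -euo pipefail

# Install keyring and the Artifact Registry keyring backend into the
# cluster's Python environment.
/databricks/python/bin/pip install keyring <artifact-registry-backend-package>

# Point pip at the private index for later installs on this cluster.
cat >> /etc/pip.conf <<'EOF'
[global]
extra-index-url = https://<REGION>-python.pkg.dev/<PROJECT>/<REPO>/simple/
EOF
```

With the backend registered, pip should obtain OAuth tokens from the cluster's attached service account instead of prompting for credentials.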

When we follow this procedure with any runtime greater than 10.4, it fails. Specifically, the backend package does not, or is unable to, set up the needed backend with `keyring`.  When we run the command



keyring --list-backends


it does not list `GooglePythonAuth` as one of the backends.  I can run `%pip freeze` and verify that `keyring` and the backend package are installed by pip.  I can also run `%pip install` in a notebook successfully, but the `GooglePythonAuth` keyring backend is never set up.  This only happens on runtimes greater than 10.4.
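One way to dig into this from a notebook cell: keyring discovers third-party backends through the `keyring.backends` entry-point group, so the standard library can show whether the installed package registered anything at all. This sketch uses only the standard library; the `group=` keyword argument requires Python 3.10+, so there is a fallback for older runtimes:

```python
from importlib import metadata

def keyring_backend_entry_points():
    """List entry points registered under keyring's 'keyring.backends' group."""
    try:
        # Python 3.10+ selectable entry-points API.
        return list(metadata.entry_points(group="keyring.backends"))
    except TypeError:
        # Python <= 3.9: entry_points() returns a dict of group -> list.
        return list(metadata.entry_points().get("keyring.backends", []))

for ep in keyring_backend_entry_points():
    print(ep.name, "->", ep.value)
```

If this prints nothing on the newer runtime, the backend package is present in `pip freeze` but its entry points are not visible to the interpreter that keyring runs under, which would point to an environment mismatch rather than a failed install.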

How can we get the keyring backend to install and set up correctly on later runtime versions?



Community Manager

Hi @Ryan512, it seems you’re encountering an issue where the required backend is not being set up with keyring on Databricks runtimes greater than 10.4. A few things to check:

  1. Check Compatibility:

    • Verify that the keyring backend package supports the Python version shipped with your newer runtime; a package pinned for an older Python can install without registering its backend.

  2. Update Dependencies:

    • Ensure that you’re using the latest versions of both keyring and the backend package. Run the following command in a notebook cell to update them:
      %pip install --upgrade keyring
    • After updating, restart your cluster and try again.
  3. Verify Service Account Permissions:

    • Confirm that the Google Service Account associated with your Databricks cluster has the necessary permissions to access the Google Artifact Registry.
    • Make sure the service account has read access to the registry where your private Python packages are stored.
  4. Check Environment Variables:

    • Verify that the required environment variables are correctly set. These typically include credentials or authentication tokens.
    • Double-check the configuration settings related to the Google Artifact Registry in your Databricks cluster.
  5. Databricks Runtime Changes:

    • Runtimes after 10.4 changed how the cluster’s Python environment and notebook-scoped libraries are handled, so a backend installed in one environment may not be visible where pip runs. Review the release notes for your runtime for Python environment changes.

  6. Debugging:

    • If the issue persists, consider enabling debug logs for pip and keyring to get more detailed information about what’s going wrong.
    • You can use %sh magic commands to run shell commands within a Databricks notebook and check for any error messages or warnings.

Good luck! 😊🚀
