Hello,
I’ve been trying to set up my local MLflow client to work with Databricks Community Edition, but I’m running into issues with authentication. I followed the official setup guide for integrating MLflow with Databricks, but when I try to run any MLflow commands (e.g., logging experiments, accessing the Databricks tracking URI), I get the following error message:
mlflow.utils.credentials: No valid Databricks credentials found, please enter your credentials...
I’m using Databricks Community Edition at the URL https://community.cloud.databricks.com and have set up a workspace there. However, I suspect that either my environment variables aren’t being picked up or mlflow.login() isn’t accepting my credentials.
So far, I’ve made sure that:
I've created a Databricks personal access token (PAT).
I've set the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables locally.
I've tried both mlflow.set_tracking_uri() and mlflow.login(), but I still hit the same error (the snippet below shows roughly what I'm running).
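For reference, this is roughly the code I'm running locally. The token, email, and experiment path below are placeholders, not my real values:

```python
import os

import mlflow

# Placeholder credentials -- replaced with my actual workspace URL and PAT locally
os.environ["DATABRICKS_HOST"] = "https://community.cloud.databricks.com"
os.environ["DATABRICKS_TOKEN"] = "<my-personal-access-token>"

# Point the local MLflow client at the Databricks-hosted tracking server
mlflow.set_tracking_uri("databricks")

# I've also tried the interactive flow instead of the env vars:
# mlflow.login(backend="databricks")

# Experiments on Databricks have to live under a workspace path
mlflow.set_experiment("/Users/<my-email>/mlflow-test")

# Minimal logging call that triggers the credentials error
with mlflow.start_run():
    mlflow.log_param("test_param", 1)
    mlflow.log_metric("test_metric", 0.5)
```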
Has anyone successfully connected their local MLflow client to Databricks Community Edition? If so, what steps did you take to troubleshoot or resolve the authentication issue? Are there any additional configuration steps or tips you could recommend?
Thanks in advance!