Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Docker image with libraries + MLflow experiments

fsimoes
New Contributor II

Hi everybody,

I have a scenario where multiple teams work with Python and R, and these teams use many different libraries.

Because of these dozens of libraries, cluster startup took a long time. So I created a Docker image where I can install all the libraries we need, for both Python and R.
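For reference, a custom image along these lines would typically start from one of the Databricks Container Services base images; the package names below are purely illustrative:

```dockerfile
FROM databricksruntime/standard:latest

# Python libraries shared across teams (illustrative list)
RUN /databricks/python3/bin/pip install mlflow pandas scikit-learn

# R libraries (illustrative list)
RUN R -e 'install.packages(c("dplyr", "data.table"), repos = "https://cran.r-project.org")'
```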

It works fine, and cluster startup is no longer a problem. But now MLflow experiments don't work anymore: runs are saved under a path like "/tmp/Rserv", and we can't see them in the Experiments UI.

How can I keep my libraries in this Docker image while still having access to the experiments in the Databricks UI?

1 ACCEPTED SOLUTION

Accepted Solutions

UTC_IT_Technolo
New Contributor III

If you have created a Docker image with all the necessary libraries for your Python and R projects, and MLflow experiments are not working (run data is not visible in the Databricks UI), here are a few steps you can take to address the problem:

Mount a shared volume: Ensure that a shared volume or directory is mounted between your Docker container and Databricks. This allows MLflow to save experiment data to a location that Databricks can access.

Set the MLflow artifact URI: In your Docker image, or when starting your container, set the MLFLOW_ARTIFACT_URI environment variable to the mounted directory or another location accessible by Databricks. This tells MLflow where to store experiment artifacts.
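A minimal sketch of this step, assuming the variable name from the point above and an illustrative DBFS path (set it before MLflow is imported so it takes effect):

```python
import os

# Hypothetical example: point artifact storage at a location the workspace
# can read. "dbfs:/mlflow/artifacts" is an illustrative path, not a required one.
os.environ["MLFLOW_ARTIFACT_URI"] = "dbfs:/mlflow/artifacts"
print(os.environ["MLFLOW_ARTIFACT_URI"])
```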

Verify permissions and access: Ensure that the mounted directory has the necessary permissions and that the Databricks environment can read and write data at that location. Check the Databricks documentation for details on configuring access and permissions.

Update the MLflow tracking URI: If you are using a remote or custom MLflow tracking server, make sure the MLflow tracking URI points to the correct location. On Databricks, pointing the tracking URI at the workspace tracking server is what makes runs show up in the Experiments UI.
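A sketch of that configuration via MLflow's standard environment variables, assuming the container runs on a Databricks cluster; the experiment path "/Shared/team-experiments" is illustrative:

```python
import os

# Set these before any MLflow import so runs log to the Databricks
# workspace tracking server and appear in the Experiments UI.
os.environ["MLFLOW_TRACKING_URI"] = "databricks"
os.environ["MLFLOW_EXPERIMENT_NAME"] = "/Shared/team-experiments"
print(os.environ["MLFLOW_TRACKING_URI"])
```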

By following these steps, you should be able to use your Docker image with the necessary libraries and still see your MLflow experiments in the Databricks UI. Review the MLflow and Databricks documentation for instructions and examples specific to your setup.


2 Replies


Anonymous
Not applicable

Hi @Fabio Simoes,

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance! 
