Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Forum Posts

TomasP
by New Contributor III
  • 1859 Views
  • 3 replies
  • 0 kudos

Two or more different ML models on one cluster.

Hi, have you already dealt with a situation where you would like to have two different ML models on one cluster? I.e., I have a project which contains two or more different models with different purposes. The goal is to have three differ...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Tomas Peterek, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Than...

2 More Replies
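The thread does not reach a resolution, so here is a minimal sketch of one common pattern for running several models on a single cluster, assuming both models are already registered in the MLflow Model Registry. The model names, table names, and columns below are hypothetical.

```python
import mlflow.pyfunc

# `spark` is the SparkSession that Databricks notebooks provide by default.
# Load each registered model as a Spark UDF; the two UDFs coexist on the
# same cluster and can score different DataFrames for different purposes.
churn_udf = mlflow.pyfunc.spark_udf(spark, model_uri="models:/churn_model/Production")
forecast_udf = mlflow.pyfunc.spark_udf(spark, model_uri="models:/forecast_model/Production")

# Hypothetical input tables, one per use case.
churn_scored = (
    spark.table("customers")
         .withColumn("churn_score", churn_udf("age", "tenure", "monthly_spend"))
)
demand_forecast = (
    spark.table("sales_history")
         .withColumn("forecast", forecast_udf("store_id", "week", "units_sold"))
)
```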
confusedIntern
by New Contributor III
  • 3611 Views
  • 4 replies
  • 2 kudos

Uploaded a Docker image to a cluster and used the cluster for an MLflow experiment, but no experiment runs are logged. Why is this?

Hi! So I used this MLflow experiment I found on the Databricks website: https://docs.databricks.com/_static/notebooks/machine-learning-with-unity-catalog.html And I created this cluster using a custom Docker image I created myself: Usually when I c...

(Three screenshots attached.)
Latest Reply
Debayan
Databricks Employee
  • 2 kudos

Have you tried the steps mentioned in the URL below: https://docs.databricks.com/clusters/custom-containers.html#step-3-launch-your-cluster

3 More Replies
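One low-effort check, beyond the container setup steps linked above: log a throwaway run against an explicit experiment path and see whether it appears in the Experiments UI. This is a minimal sketch; the experiment path, parameter, and metric are illustrative, not taken from the thread.

```python
import mlflow

# Log to an explicit workspace experiment instead of relying on the
# notebook's default experiment (path is illustrative).
mlflow.set_experiment("/Users/someone@example.com/docker-image-smoke-test")

with mlflow.start_run(run_name="smoke-test"):
    mlflow.log_param("cluster_image", "custom-docker")
    mlflow.log_metric("dummy_metric", 1.0)

# If this run shows up in the UI, MLflow logging works from the custom
# container, and the problem is likely in how the original notebook
# configures its experiment or runs.
```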
Saeed
by New Contributor II
  • 6459 Views
  • 2 replies
  • 1 kudos

Resolved! MLflow search runs getting HTTP 429 error

I am facing an issue loading an ML artifact for a specific run: I search the experiment runs to get a specific run_id as follows: https://www.mlflow.org/docs/latest/rest-api.html#search-runs API request to https://eastus-c3.azuredatabricks.net/api/2....

Latest Reply
sean_owen
Databricks Employee
  • 1 kudos

Yes, you will hit rate limits if you query the API that fast in parallel. Do you just want to manipulate the run data in an experiment with Spark? You can simply load all that data into a DataFrame with spark.read.format("mlflow-experiment").load(...

1 More Reply
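A minimal sketch of the approach Sean describes above, using the Databricks "mlflow-experiment" Spark data source instead of many parallel REST calls. The experiment ID and metric name are placeholders.

```python
experiment_id = "1234567890"  # placeholder: your experiment ID

# One Spark read pulls all runs of the experiment, avoiding the per-request
# rate limits that cause HTTP 429 responses.
runs_df = spark.read.format("mlflow-experiment").load(experiment_id)

# Example: find the run with the best (lowest) value of a logged metric,
# assuming a metric named "rmse" was logged on the runs.
best_run = (
    runs_df.selectExpr("run_id", "metrics['rmse'] AS rmse")
           .orderBy("rmse")
           .first()
)
print(best_run.run_id)  # fetch this run's artifact once with the MLflow client
```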
User16826994223
by Honored Contributor III
  • 1927 Views
  • 1 reply
  • 0 kudos

Resolved! Exception: Run with UUID l567845ae5a7cf04a40902ae789076093c is already active.

I'm trying to create a new experiment in MLflow but I have this problem: Exception: Run with UUID l142ae5a7cf04a40902ae9ed7326093c is already active. Snippet: mlflow.set_experiment("New experiment 2") mlflow.set_tracking_uri('http://mlflow:5000') ...

Latest Reply
User16826994223
Honored Contributor III
  • 0 kudos

You have to run mlflow.end_run() to end the currently active run. Then you can create another.

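A minimal sketch of the fix described in the reply, reusing the experiment name and tracking URI from the question:

```python
import mlflow

mlflow.set_tracking_uri("http://mlflow:5000")
mlflow.set_experiment("New experiment 2")

# End any run left active by an earlier cell or notebook.
# end_run() is a no-op if nothing is active, so it is safe to call first.
mlflow.end_run()

# Using start_run() as a context manager ends the run automatically,
# which avoids hitting the "already active" exception again.
with mlflow.start_run():
    mlflow.log_metric("accuracy", 0.9)
```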
User16826994223
by Honored Contributor III
  • 3536 Views
  • 2 replies
  • 0 kudos

Resolved! Can we delete an MLflow experiment?

I am using MLflow and my need of the hour is to delete an experiment and then create another experiment with the same run. client = MlflowClient(tracking_uri=server) client.delete_experiment(1) This deletes the experiment, but when I run a new experim...

Latest Reply
User16826994223
Honored Contributor III
  • 0 kudos

SQL database: This is trickier, as there are dependencies that need to be deleted. I am using MySQL, and these commands work for me: USE mlflow_db; # the name of your database DELETE FROM experiment_tags WHERE experiment_id=ANY( SELECT experime...

1 More Reply
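To complete the pattern: client.delete_experiment() is a soft delete, so the experiment's name stays reserved until the deleted experiment is purged from the backend store. Below is a minimal sketch for a self-managed tracking server with a SQL backend and a recent MLflow version, using the mlflow gc CLI rather than the hand-written MySQL statements in the reply; the tracking URI, experiment ID, and experiment name are placeholders.

```python
from mlflow.tracking import MlflowClient

# Placeholder backend store; in the reply above this is a MySQL database.
client = MlflowClient(tracking_uri="sqlite:///mlflow.db")

# Soft delete: the experiment moves to the "deleted" lifecycle stage,
# but its name remains reserved.
client.delete_experiment("1")

# In recent MLflow versions, deleted experiments and runs can be purged
# from the backend store with the CLI, for example:
#   mlflow gc --backend-store-uri sqlite:///mlflow.db
# (On MySQL, point --backend-store-uri at the same database the server uses.)

# After the purge, an experiment reusing the old name can be created.
new_experiment_id = client.create_experiment("my_experiment")
```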