Unable to create a model version using the REST API on Managed MLflow on GCP. Getting a Failed Registration error.

sroychow
New Contributor

I am trying to use Managed MLflow as the tracking server on GCP. I use the REST APIs to connect to MLflow using a Databricks token.

I can create an experiment and even the registered model, but when I try to create a model version I run into the following error.

(error screenshot attached)

Earlier, I was able to use an MLflow server successfully on AWS when I used something like the following:

mlflow server \
    --backend-store-uri <postgres-sql-db> \
    --default-artifact-root s3://my-mlflow-bucket/ \
    --host 0.0.0.0

In that case, I installed MLflow as a tracking server on an EC2 instance and the REST APIs worked fine.
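For context, the calls I am making now look roughly like this (the host, token, model name, source path, and run ID below are placeholders, not my real values):

import os
import requests

host = os.environ["DATABRICKS_HOST"]  # the workspace URL on GCP
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Creating the registered model works fine.
requests.post(
    f"{host}/api/2.0/mlflow/registered-models/create",
    headers=headers,
    json={"name": "my-model"},
).raise_for_status()

# Creating a model version is where I get the failed registration.
resp = requests.post(
    f"{host}/api/2.0/mlflow/model-versions/create",
    headers=headers,
    json={
        "name": "my-model",
        "source": "gs://my-bucket/path/to/model",  # not sure what this should point to
        "run_id": "<run-id>",
    },
)
print(resp.status_code, resp.text)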

At this point, I would like to know what I can do, because I do not have access to any Databricks compute instance.

These are my questions:

-- Is there a way to check which backend store and artifact store are currently being used on a managed instance?

-- Is it possible to change these settings while the instance is running? If not, is there a way to stop the instance and change the settings, or do I need a new instance?

-- Is there a way to see the contents of the backend store even though it is managed by Databricks?

-- I need to know where and how the model URIs are stored, so that we can set the parameters in the REST APIs appropriately.

My design at this point is to use Managed MLflow so that it takes care of the backend store, while I keep some control over the GCS bucket; roughly what I have in mind is sketched below.
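I am not sure whether the managed service supports this, but what I am picturing is roughly: create the experiment with its artifact location pointed at my own GCS bucket and let Databricks handle the backend store (the experiment name and bucket are placeholders):

import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Create an experiment whose artifacts go to a bucket I control,
# while the backend store stays managed by Databricks.
resp = requests.post(
    f"{host}/api/2.0/mlflow/experiments/create",
    headers=headers,
    json={
        "name": "/my-experiment",                       # placeholder name
        "artifact_location": "gs://my-mlflow-bucket/",  # placeholder bucket
    },
)
print(resp.status_code, resp.text)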

I would really appreciate some feedback.

Thanks

Shounak

2 REPLIES

Atanu
Databricks Employee

If you want to log the model as an artifact of a run and register it the way you're doing it, you'll need to either move the training into an MLflow run or manually log the model yourself.
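Roughly something like this, just as an illustration (scikit-learn and the names here are placeholders, not your setup):

import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)

with mlflow.start_run() as run:
    model = LogisticRegression(max_iter=200).fit(X, y)
    # Log the model as an artifact of this run on the tracking server.
    mlflow.sklearn.log_model(model, artifact_path="model")

# Register the run artifact, so the model version's source is a
# run-managed location rather than an arbitrary path.
mlflow.register_model(f"runs:/{run.info.run_id}/model", "my-model")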

I am not sure if that really helps here.

jose_gonzalez
Databricks Employee

Hi @Shounak Roychowdhury,

Just a friendly follow-up. Do you still need help, or were you able to find the solution to this question? Please let us know.
