Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

ML experiment giving error - RESOURCE_DOES_NOT_EXIST

dbuser24
New Contributor III

Followed the documentation below to create an ML experiment:
https://docs.databricks.com/aws/en/mlflow/experiments

I created an experiment using the Databricks console, then tried running the code below, but I am getting this error: RESOURCE_DOES_NOT_EXIST: Parent directory /Users/<username> does not exist

import mlflow
import os
import numpy as np
from sklearn.linear_model import LinearRegression

experiment_name = "/Users/<username>/my_ml_experiment"
mlflow.set_experiment(experiment_name)

with mlflow.start_run():
    # Log hyperparameters
    mlflow.log_param("alpha", 0.01)
    mlflow.log_param("fit_intercept", True)

    # Log metrics
    mlflow.log_metric("rmse", 0.25)
    mlflow.log_metric("r2", 0.95)

    # Write a sample artifact to DBFS and log it
    artifact_dir = "/dbfs/FileStore/mlflow_artifacts"
    os.makedirs(artifact_dir, exist_ok=True)

    artifact_path = os.path.join(artifact_dir, "info.txt")
    with open(artifact_path, "w") as f:
        f.write("This is a sample artifact for MLflow logging in Databricks.")

    mlflow.log_artifact(artifact_path)

    # Fit and log a simple linear model
    X = np.array([[1], [2], [3], [4]])
    y = np.array([2, 4, 6, 8])
    model = LinearRegression(fit_intercept=True)
    model.fit(X, y)

    mlflow.sklearn.log_model(model, "linear_model")

experiment = mlflow.get_experiment_by_name(experiment_name)
print(f"Experiment ID: {experiment.experiment_id}")
print(f"Artifact Location: {experiment.artifact_location}")




14 REPLIES

BS_THE_ANALYST
Esteemed Contributor

@dbuser24 I think this is just an innocent oversight. 

In your code you have the following:
experiment_name = "/Users/<username>/my_ml_experiment"

I'm assuming you were meant to fill this in and reference a directory that you have available in Databricks, or create one for this exercise. That's why the error says "RESOURCE_DOES_NOT_EXIST: Parent directory /Users/<username> does not exist": it can't find that location.
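
If it helps, here is a minimal sketch of one way to avoid a hard-coded placeholder, assuming the code runs in a Databricks notebook where spark is available; the current_user() lookup and the my_ml_experiment name are illustrative, not from the original post.

# A sketch, assuming a Databricks notebook where `spark` is defined; names are illustrative.
import mlflow

# Resolve the signed-in user instead of leaving a <username> placeholder.
username = spark.sql("SELECT current_user()").first()[0]

# Point MLflow at an experiment path under that user's existing workspace folder.
experiment_name = f"/Users/{username}/my_ml_experiment"
mlflow.set_experiment(experiment_name)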

All the best,
BS

dbuser24
New Contributor III

@BS_THE_ANALYST Thanks for your response. A directory with my username does exist; I have just used the placeholder <username> due to company privacy policy. I have also tried the Shared directory.

dbuser24
New Contributor III

Do ML experiments need some kind of setting at the admin level? Other members of the team are also getting the same error.

szymon_dybczak
Esteemed Contributor III

Hi @dbuser24 ,

What type of cluster do you use? Which access mode and runtime?

Hi @szymon_dybczak - I am using the 16.1 ML cluster

BS_THE_ANALYST
Esteemed Contributor

@dbuser24 Perhaps a way to determine this would be trying it in the Databricks Free Edition? That could provide some insight.

By the sound of the error message, it seems more like a permissions issue around being able to create/read from a directory.

If you have a Free Edition, could you try one of the following notebooks/guides? They've worked for me previously.

I imported this notebook from this resource: https://docs.databricks.com/aws/en/mlflow/end-to-end-example. If you look at the navigation bar on the left-hand side of the site, you'll see there are a few out-of-the-box examples you can import straight into your environment.

I tried this one: https://docs.databricks.com/aws/en/notebooks/source/mlflow/mlflow-classic-ml-e2e-mlflow-3.html and it worked perfectly fine in my Databricks Free environment. I'm not too familiar with all the ML components in Databricks, but it's working. All it required was for me to create a catalog called "main" - I didn't want to change all the catalog names in the paths that get referenced haha!
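
If you hit that same catalog requirement, a hedged sketch of one way to create it from a notebook is below; it assumes a Unity Catalog-enabled workspace where spark is available, and "main" is simply the name the example notebook expects.

# A sketch, assuming a Unity Catalog-enabled workspace and a notebook where `spark` is defined.
spark.sql("CREATE CATALOG IF NOT EXISTS main")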

Definitely a really cool resource that I've linked though. I'll be learning ML by examples through that. For what it's worth, below is a picture of the ML Notebook working in my environment:

BS_THE_ANALYST_0-1754905331019.png

All the best,
BS

Thanks @BS_THE_ANALYST - let me try the above references and get back on this. Much appreciated!!

dbuser24
New Contributor III

@BS_THE_ANALYST I imported the notebook - https://docs.databricks.com/aws/en/notebooks/source/mlflow/mlflow-classic-ml-e2e-mlflow-3.html

Still getting the same error (RESOURCE_DOES_NOT_EXIST) at this step:

with mlflow.start_run() as run:

Screenshot 2025-08-11 at 3.58.49 PM.png
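
As an illustrative aside (not something from the thread), a quick check like the sketch below can help confirm what start_run() will see; it assumes a Databricks notebook, and the experiment path is a placeholder.

# A diagnostic sketch; the experiment path below is a placeholder.
import mlflow

print(mlflow.get_tracking_uri())  # typically "databricks" when running on a Databricks cluster

exp = mlflow.get_experiment_by_name("/Users/<username>/my_ml_experiment")
print(exp)  # None means no experiment was found at that path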

szymon_dybczak
Esteemed Contributor III

I've tried running your code in my sandbox environment and I didn't encounter any issues.

I did the following steps:

- (1) In my workspace, under my username directory, I created an ML folder:

szymon_dybczak_0-1754911903637.png


- (2) Next, I went to my target folder (in this case the ML directory I created) and clicked Create MLflow experiment:

szymon_dybczak_1-1754911977108.png

- (3) Now, I typed in my experiment name. If you leave the artifact location empty, by default Databricks will use the following one: dbfs:/databricks/mlflow-tracking/<experiment-id>

 

szymon_dybczak_2-1754912106997.png

- (4) Now, I created a new notebook and ran the code (a rough code sketch of these steps is included below):

szymon_dybczak_4-1754912444633.png
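
For reference, a rough code-level sketch of the flow in steps (1)-(4); it assumes the ML folder already exists under your user directory, and the experiment path is only an example.

# A sketch of the UI steps above, assuming /Users/<username>/ML already exists in the workspace.
import mlflow

experiment_name = "/Users/<username>/ML/my_ml_experiment"  # placeholder path - adjust to your own folder
mlflow.set_experiment(experiment_name)  # creates the experiment if the parent folder exists

exp = mlflow.get_experiment_by_name(experiment_name)
print(exp.experiment_id)
print(exp.artifact_location)  # default: dbfs:/databricks/mlflow-tracking/<experiment-id>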

 

Could you check if you did the same kind of steps?

sure @szymon_dybczak 

dbuser24
New Contributor III

Thanks @szymon_dybczak for the detailed steps. In addition to the above, I had to add logic to create the directory if it is not present to get it working:

# Create the experiment's parent directory if it does not already exist
parent_dir = os.path.dirname(experiment_name)
dbutils.fs.mkdirs(parent_dir)

To summarise:
1. Created the ML experiment from within the directory.
2. Verified and corrected the full path of the experiment.
3. Added permission at the experiment level.
4. Added logic to create the directory if not present.
Screenshot 2025-08-11 at 5.44.04 PM.png, Screenshot 2025-08-11 at 5.54.32 PM.png, Screenshot 2025-08-11 at 5.53.43 PM.png, Screenshot 2025-08-11 at 5.43.52 PM.png
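
For context, a hedged sketch of how that directory check might sit alongside set_experiment; it assumes a Databricks notebook where dbutils is available, and the path is still a placeholder.

# A sketch combining the directory check described above with the experiment setup.
import os
import mlflow

experiment_name = "/Users/<username>/my_ml_experiment"  # placeholder path

# Create the parent directory if it does not already exist (dbutils.fs paths refer to DBFS).
parent_dir = os.path.dirname(experiment_name)
dbutils.fs.mkdirs(parent_dir)

mlflow.set_experiment(experiment_name)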

dbuser24
New Contributor III

Thanks @szymon_dybczak and @BS_THE_ANALYST - This is resolved, appreciate all the support.

szymon_dybczak
Esteemed Contributor III

Great, glad that we were able to help!

BS_THE_ANALYST
Esteemed Contributor

Can you mark your own post as a solution as well, @dbuser24? (It would be useful for the additional steps.)

Appreciate you feeding back your findings. Congrats on getting it working.

All the best,
BS
