08-11-2025 01:35 AM
I followed the documentation below to create an ML experiment -
https://docs.databricks.com/aws/en/mlflow/experiments
I created an experiment using the Databricks console, then tried running the code below, but I'm getting the error - RESOURCE_DOES_NOT_EXIST: Parent directory /Users/<username> does not exist
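The snippet itself didn't carry over into this post, but it's roughly the standard pattern of pointing MLflow at a workspace path and starting a run (the path and parameter below are placeholders, not my real ones):

import mlflow

# The parent folder /Users/<username> must already exist in the workspace,
# otherwise creating the experiment fails with RESOURCE_DOES_NOT_EXIST.
experiment_name = "/Users/<username>/my_ml_experiment"
mlflow.set_experiment(experiment_name)

with mlflow.start_run():
    mlflow.log_param("example_param", 1)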
08-11-2025 01:38 AM - edited 08-11-2025 01:38 AM
@dbuser24 I think this is just an innocent oversight.
In your code you have the following:
experiment_name = "/Users/<username>/my_ml_experiment"
I'm assuming you were meant to fill this in and reference a directory that you have available in Databricks, or create one for this exercise. That's why the error says "RESOURCE_DOES_NOT_EXIST: Parent directory /Users/<username> does not exist" - it can't find this location.
All the best,
BS
08-11-2025 02:04 AM
@BS_THE_ANALYST Thanks for your response. There does exist a directory with my username; I have just used the placeholder <username> due to company privacy policy. I have also tried the Shared directory.
08-11-2025 02:04 AM
Do ML experiments need some kind of setting at the admin level? Other members of the team are also getting the same error.
08-11-2025 03:05 AM
Hi @dbuser24 ,
What type of cluster do you use? Which access mode and runtime?
08-11-2025 03:19 AM
08-11-2025 02:42 AM - edited 08-11-2025 02:43 AM
@dbuser24 perhaps a way to determine this would be trying it in the Databricks Free Edition? This could provide some insight.
By the sound of the error message, it seems more like a permissions issue around being able to create/read from a directory.
If you have a Free Edition, could you try one of the following notebooks/guides? (They've worked for me previously.)
I imported this notebook from this resource: https://docs.databricks.com/aws/en/mlflow/end-to-end-example If you look at the navigation bar on the left-hand side of the website, you'll see there are a few out-of-the-box examples you can just import into your environment.
I tried this one: https://docs.databricks.com/aws/en/notebooks/source/mlflow/mlflow-classic-ml-e2e-mlflow-3.html and it worked perfectly fine in my Databricks Free environment. I'm not too familiar with all the ML components in Databricks, but it's working. All it required was for me to create a catalog called "main". I didn't want to change all the catalog names in the paths that get referenced haha!
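If "main" doesn't exist in your workspace yet, creating it should be a one-liner along these lines (assuming Unity Catalog is enabled and you have permission to create catalogs; spark is the session already available in a Databricks notebook):

# Create the catalog the example notebook expects, if it's missing.
spark.sql("CREATE CATALOG IF NOT EXISTS main")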
Definitely a really cool resource that I've linked though. I'll be learning ML by examples through that. For what it's worth, below is a picture of the ML Notebook working in my environment:
All the best,
BS
08-11-2025 03:20 AM
Thanks @BS_THE_ANALYST - let me try using the above references and get back on this. Much appreciated!!
08-11-2025 03:34 AM
@BS_THE_ANALYST I imported the notebook - https://docs.databricks.com/aws/en/notebooks/source/mlflow/mlflow-classic-ml-e2e-mlflow-3.html
Still getting the same error - RESOURCE_DOES_NOT_EXIST at the step:
08-11-2025 04:42 AM - edited 08-11-2025 04:52 AM
I've tried running your code in my sandbox environment and I didn't encounter any issues.
I did the following steps:
- (1) In my workspace, under my username directory, I created an ML folder:
- (2) Next, I went to my target folder (in this case the ML directory I created) and clicked Create MLflow experiment
- (3) Then I typed in my experiment name. If you leave the artifact location empty, by default Databricks will use the following one: dbfs:/databricks/mlflow-tracking/<experiment-id>
- (4) Finally, I created a new notebook and ran the code shown below:
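The exact cell isn't reproduced here, but it was just the usual set_experiment plus a test run - adjust the path to the folder and experiment name you created in steps (1)-(3):

import mlflow

# Point MLflow at the experiment created through the UI above.
mlflow.set_experiment("/Users/<username>/ML/my_experiment")

# Log a dummy run just to confirm the experiment is reachable.
with mlflow.start_run():
    mlflow.log_param("alpha", 0.5)
    mlflow.log_metric("rmse", 0.42)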
Could you check if you did the same kind of steps?
08-11-2025 04:49 AM
sure @szymon_dybczak
08-11-2025 05:30 AM - edited 08-11-2025 05:32 AM
Thanks @szymon_dybczak for the detailed steps. In addition to the above, I had to write logic to create the directory if it isn't present to get it working -
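My exact snippet isn't pasted here, but the gist is to create the parent workspace folder before calling set_experiment. One way is the Databricks Python SDK (mkdirs is idempotent, so it's safe even if the folder already exists; the path is a placeholder):

import mlflow
from databricks.sdk import WorkspaceClient

# Create the workspace directory if it doesn't exist yet.
# WorkspaceClient picks up the notebook's credentials when run inside Databricks.
w = WorkspaceClient()
w.workspace.mkdirs("/Users/<username>")

mlflow.set_experiment("/Users/<username>/my_ml_experiment")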
08-11-2025 05:33 AM
Thanks @szymon_dybczak and @BS_THE_ANALYST - This is resolved, appreciate all the support.
08-11-2025 05:38 AM
Great, glad that we were able to help!
08-11-2025 05:45 AM
Can you mark your own post as a solution as well, @dbuser24? (It would be useful for the additional steps.)
Appreciate you feeding back your findings. Congrats on getting it working.
All the best,
BS