How can I programmatically get my notebook path?
06-10-2021 05:19 AM
I'm writing some code that trains an ML model using MLflow and a given set of hyperparameters. This code is going to be run by several folks on my team, and I want to make sure that the experiment that gets created lives in the same directory as the notebook - i.e., if someone clones the notebook into their own user folder, the MLflow experiment should point to their notebook's new location.
- Labels: Notebook, Notebook Path
06-21-2021 12:38 PM
In Scala you can use dbutils.notebook.getContext.notebookPath (call .get on the result to unwrap the Option). If you need the value in Python or R, one way to share it is through widgets: run the code in a %scala cell, write the value into a widget, then read the widget from Python or R.
07-30-2021 12:51 PM
In Scala the call is
dbutils.notebook.getContext.notebookPath.get
In Python the call is
dbutils.entry_point.getDbutils().notebook().getContext().notebookPath().getOrElse(None)
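Tying this back to the original question: once you have the notebook path, deriving a sibling MLflow experiment path is plain string handling. A minimal sketch (the helper name, the sample path, and the mlflow.set_experiment call at the end are illustrative, not from this thread):

```python
import posixpath

def experiment_path_for(notebook_path: str, experiment_name: str) -> str:
    """Build a workspace path for an MLflow experiment that sits in the
    same folder as the notebook, so a cloned notebook gets its own
    experiment in the cloner's folder."""
    return posixpath.join(posixpath.dirname(notebook_path), experiment_name)

# Inside Databricks you would obtain the path as shown above, e.g.:
# notebook_path = dbutils.entry_point.getDbutils().notebook() \
#     .getContext().notebookPath().getOrElse(None)
notebook_path = "/Users/someone@example.com/train_model"  # sample value
print(experiment_path_for(notebook_path, "hyperparam-search"))
# Then, in a real notebook: mlflow.set_experiment(<that path>)
```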
If you need it in another language, a common practice would be to pass it through spark config.
Ignoring that we can get the value in Python (as seen above), if you start with a Scala cell like this:
%scala
val path = dbutils.notebook.getContext.notebookPath.get
spark.conf.set("com.whatever.notebook-path", path)
you can then read it in the next Python cell (the same concept applies to R and SQL):
%python
path = spark.conf.get("com.whatever.notebook-path")
Just adjust the configuration parameter's name to something unique so it doesn't accidentally clash with anything else. For example, we often use a "com.databricks.training" prefix.
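Outside a Databricks cluster you can't exercise spark.conf directly, but the hand-off pattern itself is easy to sanity-check with a plain mapping standing in for the config (the key name below is illustrative, as noted above):

```python
# A dict standing in for spark.conf; set/get behave the same way here:
# one cell writes under a unique key, a later cell in another language
# reads it back.
conf = {}

# "Scala side": store the notebook path under a namespaced key.
conf["com.example.training.notebook-path"] = "/Users/someone@example.com/train_model"

# "Python side": read it back in a later cell.
path = conf["com.example.training.notebook-path"]
print(path)
```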

