3 weeks ago
result = dbutils.notebook.run("/Workspace/YourFolder/NotebookA",
                              timeout_seconds=600, arguments={"param1": "value1"})
print(result)
I was able to execute the above code manually from a notebook.
But when I run the same notebook as a job, it fails, stating that the path was not found.
The complete code is checked in to the repo, and the job reads its source from the repo.
I'm still unclear why this fails, since I have specified the absolute path of the workspace NotebookA.
3 weeks ago
Hello @siva_pusarla,
As per the official docs, the "absolute path" examples for running/importing notebooks use workspace object paths, so /Workspace/YourFolder/NotebookA is often not a valid notebook object path, unless that exact path exists as a notebook path in your workspace.
Also, when the job is configured to run from a remote Git repository, the notebook being executed is an ephemeral checkout, and paths behave differently between Git folders and workspace folders. Databricks even calls out that Git-folder path behaviour can differ from workspace-folder behaviour.
Please refer to doc: https://kb.databricks.com/libraries/paths-behave-differently-on-git-folders-and-workspace-folders
So in the job run, Databricks tries to resolve /Workspace/YourFolder/NotebookA as a notebook path, can't find it, and hence throws the path-not-found error.
Please refer to this doc for notebook workflow best practices: https://docs.databricks.com/aws/en/notebooks/notebook-workflows
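
If the notebook really does exist at that location in the workspace tree, one thing worth trying is stripping the /Workspace filesystem prefix before calling dbutils.notebook.run. A minimal sketch, assuming the notebook exists at /YourFolder/NotebookA as a workspace object (path and parameter names are taken from your post):

# Sketch: normalize a filesystem-style path (/Workspace/...) to a workspace
# object path before calling dbutils.notebook.run. Assumes the notebook
# actually exists at /YourFolder/NotebookA in the workspace tree.
raw_path = "/Workspace/YourFolder/NotebookA"
notebook_path = raw_path.removeprefix("/Workspace")  # -> "/YourFolder/NotebookA"

result = dbutils.notebook.run(notebook_path, timeout_seconds=600,
                              arguments={"param1": "value1"})
print(result)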
3 weeks ago - last edited 3 weeks ago
Hi Anudeep,
Thanks for your response.
NotebookA is not part of the repo code; it is a constant file in the workspace.
I would like to keep some files local to the workspace (for env setup), irrespective of the repos.
| git_folder (GIT)
| -- module
| ---- app.py
...
| Workspace_folder
| -- Common_Utils
| ---- env_setup.py
env_setup is local to each workspace (dev, test, and prod), hence it cannot be checked in to the repo.
With the above setup, I want to run dbutils.notebook.run("/Workspace/Common_Utils/env_setup") from app.py while executing the app through a workflow/job, but it fails with:
com.databricks.WorkflowException: com.databricks.NotebookExecutionException: FAILED: Unable to access the notebook "Workspace/Common_Utils/env_setup". Either it does not exist, or the identity used to run this job, xxx, lacks the required permissions.
But both the notebook and the permission on it exist, and the call works fine when run outside the job.
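
One way to narrow this down is to print, from inside the failing job, the path and identity the run actually sees. A small diagnostic sketch (standard dbutils/Spark calls on Databricks, nothing assumed beyond that):

# Print the notebook path the job resolves and the identity it runs as,
# to check whether the Git checkout or a run-as identity is the culprit.
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
print("Notebook path:", ctx.notebookPath().get())
print("Running as:", spark.sql("SELECT current_user()").first()[0])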
3 weeks ago
Try converting env_setup into repo-based code and control behavior via the environment.
Instead of a workspace notebook, use a Python module in the repo and drive environment differences using:
- Job parameters
- Branches (dev / test / prod)
- Secrets (workspace-specific)
Example repo structure:
repo/
├── common_utils/
│   └── env_setup.py
└── app.py
Example:
env_setup.py

from databricks.sdk.runtime import dbutils

def load_env(env):
    # Return workspace-specific settings for the requested environment.
    if env == "dev":
        return {
            "catalog": "dev_catalog",
            "password": dbutils.secrets.get("app-secrets", "db-password"),
        }
    if env == "test":
        return {
            "catalog": "test_catalog",
            "password": dbutils.secrets.get("app-secrets", "db-password"),
        }
    if env == "prod":
        return {
            "catalog": "prod_catalog",
            "password": dbutils.secrets.get("app-secrets", "db-password"),
        }
    # Fail fast instead of silently returning None for an unknown env.
    raise ValueError(f"Unknown environment: {env!r}")
app.py

from common_utils.env_setup import load_env
from databricks.sdk.runtime import dbutils

# "env" arrives as a job parameter (exposed as a notebook widget).
env = dbutils.widgets.get("env")
config = load_env(env)
print(f"Running in {env}, catalog = {config['catalog']}")
a week ago
I agree with your approach, but I would like to keep my env setup away from the repo, limiting the repo to the application code only and keeping the env setup close to the environment where the job runs.
In the future, I should be able to run the application in various environments, and a change in the env should not require updating the app code in the repo every time.
a week ago - last edited a week ago
@siva_pusarla: We use the following pattern and it works:
1) Calling notebook - constant location used by Job.
+ src/framework
+ notebook_executor.py
2) Callee notebooks - dynamic
+ src/app/notebooks
+ notebook1.py
+ notebook2.py
....
+ notebookn.py
The job sends the notebook names as a parameter to notebook_executor.py, which executes them using the pattern below:

notebook_path = f"../app/notebooks/{notebook_name}"
dbutils.notebook.run(notebook_path, TIMEOUT, input_params)

The important learning for us was the use of relative paths for notebook execution. Hope it helps.
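
For completeness, here is a minimal sketch of what that executor could look like end to end. The TIMEOUT value and the run_date parameter are hypothetical; everything else follows the layout above:

# notebook_executor.py -- lives at src/framework/ per the layout above.
TIMEOUT = 600  # seconds; assumed value

notebook_name = dbutils.widgets.get("notebook_name")          # passed by the job
input_params = {"run_date": dbutils.widgets.get("run_date")}  # hypothetical param

# The relative path resolves against this notebook's own location inside the
# ephemeral Git checkout, so it works regardless of the checkout root.
notebook_path = f"../app/notebooks/{notebook_name}"
result = dbutils.notebook.run(notebook_path, TIMEOUT, input_params)
print(result)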
a week ago
Hi Siva, thanks for the response.
Yes, we use relative paths for calling notebooks within the repo.
Here I'm trying to call a notebook that is in the workspace from a job that executes from the repo.
I have ensured all the required permissions, but it still says "either the notebook does not exist or the user does not have permissions to run it".
I've only seen this behavior in the last couple of months (2025).
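
If it helps to rule out a visibility problem, you could list the folder from inside the job with the Databricks SDK. A sketch, assuming the databricks-sdk package is available on the cluster and using the folder path from your earlier post:

# Check whether the job's identity can even see the notebook: list the
# workspace folder using the job's own runtime credentials.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up the job's credentials on Databricks
# Path from the earlier post; drop the /Workspace prefix if the API rejects it.
for obj in w.workspace.list("/Common_Utils"):
    print(obj.path, obj.object_type)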