- 1032 Views
- 1 replies
- 1 kudos
I want to set up email alerts for issues in the data as part of a job run, pointing the user to the notebook in which the issue occurred. I think this would be simple enough, but another layer is that the job is going to be run...
Latest Reply
Hi, could you please clarify what you mean by returning the file from the remote repo? Please tag @Debayan in your next response, which will notify me. Thank you!
by jgrgn • New Contributor
- 1203 Views
- 0 replies
- 0 kudos
Is there a way to define the notebook path based on a parameter from the calling notebook when using %run? I am aware of dbutils.notebook.run(), but I would like all the functions defined in the referenced notebook to be available in the calling noteboo...
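As a workaround (not stated in the thread itself), a dynamic path can be assembled in plain Python and handed to dbutils.notebook.run(); note that, unlike %run, functions defined in the child notebook will not be available in the caller. The folder layout and notebook names below are hypothetical examples.

```python
# %run requires a literal path, so a parameterized path has to go
# through dbutils.notebook.run() instead. "/Shared/etl", "dev", and
# "transformations" are hypothetical examples.

def build_notebook_path(base_dir: str, env: str, name: str) -> str:
    """Assemble a notebook path from runtime parameters."""
    return f"{base_dir}/{env}/{name}"

path = build_notebook_path("/Shared/etl", "dev", "transformations")
print(path)  # → /Shared/etl/dev/transformations

# Inside a Databricks notebook you would then invoke it (timeout in seconds):
# result = dbutils.notebook.run(path, timeout_seconds=600)
```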
- 5217 Views
- 3 replies
- 2 kudos
To compile the Python scripts in Azure notebooks, we are using the magic command %run. The first parameter for this command is the notebook path; is it possible to put that path in a variable (we have to construct this path dynamically during the ...
Latest Reply
@Thushar R I don't think it is possible to pass the notebook path in a variable and run it with %run. I believe you can make use of notebook workflows. Notebook workflows are a complement to %run: https://docs.databricks.com/notebooks/notebook-workfl...
2 More Replies
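One practical detail of notebook workflows: dbutils.notebook.run() can only return a string (via dbutils.notebook.exit in the child), so a common convention — an assumption here, not something stated in the thread — is to JSON-encode structured results. The notebook name "child_nb" below is hypothetical.

```python
import json

# The child notebook would end with:
#   dbutils.notebook.exit(json.dumps({"status": "ok", "rows": 42}))
# and the caller would decode the returned string:
#   raw = dbutils.notebook.run("child_nb", timeout_seconds=300)
#   result = json.loads(raw)

# The encode/decode round trip itself is plain Python:
raw = json.dumps({"status": "ok", "rows": 42})
result = json.loads(raw)
print(result["rows"])  # → 42
```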
- 14924 Views
- 2 replies
- 2 kudos
I'm writing some code that trains an ML model using MLflow and a given set of hyperparameters. This code is going to be run by several folks on my team, and I want to make sure that the experiment that gets created is created in the same directory as ...
Latest Reply
In Scala the call is dbutils.notebook.getContext.notebookPath.get. In Python the call is dbutils.entry_point.getDbutils().notebook().getContext().notebookPath().getOrElse(None). If you need it in another language, a common practice would be to pass it thr...
1 More Replies
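Building on the reply above, one way to pin everyone's experiment to the notebook's directory is to derive the experiment path from the notebook path. The path logic below is plain Python; the dbutils and mlflow calls (and the experiment name "hparam-search") are hypothetical and only apply inside a Databricks notebook, so they are sketched as comments.

```python
import posixpath

def experiment_path_for(notebook_path: str, experiment_name: str) -> str:
    """Place the MLflow experiment in the same directory as the notebook."""
    return posixpath.join(posixpath.dirname(notebook_path), experiment_name)

print(experiment_path_for("/Users/alice@example.com/train", "hparam-search"))
# → /Users/alice@example.com/hparam-search

# Inside a Databricks notebook (Python) you would obtain the path and
# point MLflow at the derived location:
# nb_path = (dbutils.entry_point.getDbutils().notebook().getContext()
#            .notebookPath().getOrElse(None))
# import mlflow
# mlflow.set_experiment(experiment_path_for(nb_path, "hparam-search"))
```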
- 3037 Views
- 1 replies
- 0 kudos
Some of us are working with IDEs and trying to deploy notebook (.py) files to DBFS. The problem I have noticed is that when configuring jobs, those paths are not recognized. notebook_path: if I use this: dbfs:/artifacts/client-state-vector/0.0.0/bootstrap...
Latest Reply
The issue is that the Python file is saved in DBFS, not as a workspace notebook. When you give /artifacts/client-state-vector/0.0.0/bootstrap.py, the workspace will search for the notebook (a Python file in this case) under that folder under Workspace t...
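A .py file on DBFS can still be run by a job, just not as a notebook task. A rough sketch of a Jobs API 2.1 payload using spark_python_task (an assumption about the intended setup; cluster settings are omitted, and the path is the one from the question) might look like:

```json
{
  "name": "bootstrap-job",
  "tasks": [
    {
      "task_key": "bootstrap",
      "spark_python_task": {
        "python_file": "dbfs:/artifacts/client-state-vector/0.0.0/bootstrap.py"
      }
    }
  ]
}
```

A real payload also needs cluster settings (for example a new_cluster block). A notebook_task, by contrast, expects a workspace path, not a DBFS one.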
- 1799 Views
- 1 replies
- 1 kudos
We are working in IDEs, and once code is developed we put the .py file in DBFS. I am using that DBFS path (dbfs:/artifacts/kg/bootstrap.py) to create a job, but I get a "notebook not found" error. What could be the is...
Latest Reply
The actual notebooks that you create are not stored in the data plane but in the control plane. You can import notebooks through Import in the Databricks UI or by using the API. A notebook placed in DBFS cannot be used to create a job.
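As a sketch of the import route mentioned above, the Databricks CLI exposes a workspace import command; the local and workspace paths here are examples, and the exact flag spellings may differ between CLI versions:

```shell
# Import a local .py file as a workspace notebook (SOURCE format),
# then point the job's notebook_path at the workspace location.
databricks workspace import ./bootstrap.py /Shared/bootstrap \
  --language PYTHON --format SOURCE --overwrite
```

After the import, the job's notebook_path would reference /Shared/bootstrap rather than any dbfs:/ path.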