Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

cmilligan
by Contributor II
  • 764 Views
  • 1 replies
  • 1 kudos

Return notebook path from job that is run remotely from the repo

I want to set up email alerts for data issues as part of a job run, and point the user to the notebook where the issue occurred. I think this would be simple enough, but another layer is that the job is going to be run...

Latest Reply
Debayan
Esteemed Contributor III
  • 1 kudos

Hi, could you please clarify what you mean by returning the file from the remote repo? Please tag @Debayan​ with your next response, which will notify me. Thank you!

jgrgn
by New Contributor
  • 905 Views
  • 0 replies
  • 0 kudos

define notebook path from a parameter

Is there a way to define the notebook path based on a parameter from the calling notebook using %run? I am aware of dbutils.notebook.run(), but would like to have all the functions defined in the reference notebook be available in the calling noteboo...

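For context on the trade-off in the question: %run resolves its argument when the cell is parsed, so it cannot take a runtime variable, while dbutils.notebook.run() accepts a computed string but executes the target in a separate context, so its function definitions do not become available in the caller. A minimal sketch of constructing such a path dynamically (the folder and notebook names are hypothetical):

```python
def build_notebook_path(base: str, env: str, name: str) -> str:
    """Construct a notebook path at runtime, e.g. to pass to
    dbutils.notebook.run(), which accepts a computed string."""
    return f"{base.rstrip('/')}/{env}/{name}"

path = build_notebook_path("/Repos/team/project", "dev", "setup_functions")
# Inside a Databricks notebook you could then run it as a child notebook:
# dbutils.notebook.run(path, 600)
print(path)  # -> /Repos/team/project/dev/setup_functions
```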
thushar
by Contributor
  • 4285 Views
  • 4 replies
  • 2 kudos

Can we use a variable to mention the path in the %run command

To compile the Python scripts in Azure notebooks, we are using the magic command %run. The first parameter for this command is the notebook path; is it possible to put that path in a variable (we have to construct this path dynamically during the ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 2 kudos

Hi @Thushar R​, we haven't heard from you on the last response from @Akash Bhat​, and I was checking back to see if his suggestions helped you. Or else, if you have any solution, please do share that with the community as it can be helpful to oth...

3 More Replies
User16752241457
by New Contributor II
  • 11950 Views
  • 2 replies
  • 2 kudos

How can I programmatically get my notebook path?

I'm writing some code that trains an ML model using MLflow and a given set of hyperparameters. This code is going to be run by several folks on my team, and I want to make sure that the experiment that gets created is created in the same directory as ...

Latest Reply
User16857281974
Contributor
  • 2 kudos

In Scala the call is dbutils.notebook.getContext.notebookPath.get. In Python the call is dbutils.entry_point.getDbutils().notebook().getContext().notebookPath().getOrElse(None). If you need it in another language, a common practice would be to pass it thr...

1 More Replies
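Building on the reply above, the retrieved path is typically used to anchor artifacts next to the notebook. A sketch under that assumption (only the dbutils call comes from the reply; the MLflow usage and names are illustrative):

```python
import posixpath

def parent_dir(notebook_path: str) -> str:
    """Return the workspace folder containing a notebook, so shared
    training code can place its MLflow experiment alongside it."""
    return posixpath.dirname(notebook_path)

# Inside a Databricks Python notebook you would first obtain the path:
# nb_path = dbutils.entry_point.getDbutils().notebook().getContext() \
#               .notebookPath().getOrElse(None)
# mlflow.set_experiment(f"{parent_dir(nb_path)}/shared-experiment")
print(parent_dir("/Users/someone@example.com/train_model"))
# -> /Users/someone@example.com
```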
User16790091296
by Contributor II
  • 2662 Views
  • 1 replies
  • 0 kudos

Notebook path can't be in DBFS?

Some of us are working with IDEs and trying to deploy notebook (.py) files to DBFS. The problem I have noticed is that when configuring jobs, those paths are not recognized. notebook_path: if I use this: dbfs:/artifacts/client-state-vector/0.0.0/bootstrap...

Latest Reply
User16752239289
Valued Contributor
  • 0 kudos

The issue is that the Python file is saved under DBFS, not as a workspace notebook. When you give /artifacts/client-state vector/0.0.0/bootstrap.py, the workspace will search for the notebook (a Python file in this case) under that folder in the Workspace t...

User16826994223
by Honored Contributor III
  • 1470 Views
  • 1 replies
  • 1 kudos

File path Not recognisable for notebook jobs in DBFS

We are working with IDEs, and once the code is developed we put the .py file in DBFS. I am using that DBFS path, dbfs:/artifacts/kg/bootstrap.py, to create a job, but I get a notebook-not-found error. What could be the is...

Latest Reply
User16826994223
Honored Contributor III
  • 1 kudos

The actual notebooks that you create are not stored in the data plane; they are stored in the control plane. You can import notebooks through the import option in the Databricks UI or using the API. A file placed in DBFS cannot be used as the notebook for a job.

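Since this mistake surfaces only at run time, a small guard when building Jobs API payloads can catch it early. A sketch (the notebook_task field name follows the Jobs API; the helper itself is hypothetical):

```python
def notebook_task_settings(notebook_path: str) -> dict:
    """Build a minimal Jobs API notebook_task block, rejecting DBFS
    paths, since jobs resolve notebook_path against the Workspace tree."""
    if notebook_path.startswith("dbfs:/") or notebook_path.startswith("/dbfs/"):
        raise ValueError("notebook_path must be a Workspace path, not a DBFS path")
    return {"notebook_task": {"notebook_path": notebook_path}}

print(notebook_task_settings("/Shared/kg/bootstrap"))
# -> {'notebook_task': {'notebook_path': '/Shared/kg/bootstrap'}}
```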