- 2112 Views
- 2 replies
- 3 kudos
I have been using the %run command to run auxiliary notebooks from an "orchestration" notebook. I like using %run over dbutils.notebook.run because of the variable inheritance, troubleshooting ease, and the printing of the output from the auxiliary n...
Latest Reply
Hi @Aaron Petry, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer first; otherwise the bricksters will get back to you soon. Thanks!
1 More Replies
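The trade-off the question describes can be illustrated outside Databricks: %run executes the child notebook in the caller's namespace (so its variables are inherited), while dbutils.notebook.run starts a separate run that only communicates through arguments and a string exit value. A minimal local simulation of the two scoping models (the "notebook" source below is invented for illustration):

```python
# Simulate the two ways an orchestration notebook can call a child notebook.
# The child "notebook" here is just a string of Python source.
child_source = "greeting = 'hello'\nresult = 21 * 2"

# %run-style: execute the child in the CALLER's namespace -> names inherited.
caller_ns = {}
exec(child_source, caller_ns)
print(caller_ns["greeting"], caller_ns["result"])  # the caller sees both names

# dbutils.notebook.run-style: execute in an ISOLATED namespace; only an
# explicit exit value comes back (dbutils.notebook.exit returns a string).
def run_isolated(source):
    ns = {}
    exec(source, ns)
    return str(ns.get("result"))

exit_value = run_isolated(child_source)
print(exit_value)  # '42' -- but 'greeting' is not visible to the caller
```

This is only a sketch of the scoping difference, not how Databricks implements either command.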
- 4239 Views
- 2 replies
- 6 kudos
I want to store a notebook with functions two folders up from the current notebook. I know that I can start the path with ../ to go up one folder, but when I've tried .../ it won't go up two folders. Is there a way to do this?
Latest Reply
To go one folder up, use ../notebook_2; to go two folders up and access (say) a notebook named "secret", chain the components: ../../secret. (.../ is not a valid path component.)
1 More Replies
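The ../ components in a %run path resolve the way ordinary POSIX relative paths do, which is easy to check locally with posixpath (the workspace folder names below are made up for illustration):

```python
import posixpath

# Suppose the calling notebook lives in this workspace folder:
current_dir = "/Repos/project/jobs/daily"

# One folder up: %run ../notebook_2
one_up = posixpath.normpath(posixpath.join(current_dir, "../notebook_2"))
print(one_up)   # /Repos/project/jobs/notebook_2

# Two folders up: %run ../../secret  (chain "../", ".../" is not valid)
two_up = posixpath.normpath(posixpath.join(current_dir, "../../secret"))
print(two_up)   # /Repos/project/secret
```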
- 3341 Views
- 3 replies
- 6 kudos
dbutils.notebook.help() only lists the "run" and "exit" methods. I could only find references to dbutils.notebook.entry_point scattered across the web, but there does not seem to be official Databricks documentation covering its complete API anywhere. Can so...
Latest Reply
Hi @James kuo, hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!
2 More Replies
- 1689 Views
- 3 replies
- 2 kudos
I'm needing to run the contents of a folder, which can change over time. Is there a way to set up a notebook that can orchestrate running all notebooks in a folder? My thought was that if I could retrieve a list of the notebooks, I could create a loop to ru...
Latest Reply
List all the notebooks by making an API call, then run each with dbutils.notebook.run:
import requests
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
host_name = ctx.tags().get("browserHostName").get()
host_token = ctx.apiToke...
2 More Replies
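The truncated reply builds the workspace host name and API token from the notebook context and then calls the Workspace API. A fuller sketch of that approach is below; the folder path and the sample response payload are hypothetical, and the HTTP call is defined but not executed here:

```python
def list_notebooks(host, token, folder):
    """Call the Workspace API and return notebook paths under `folder`."""
    import requests  # only needed when actually calling the API
    resp = requests.get(
        f"https://{host}/api/2.0/workspace/list",
        headers={"Authorization": f"Bearer {token}"},
        params={"path": folder},
    )
    resp.raise_for_status()
    return notebook_paths(resp.json())

def notebook_paths(payload):
    # Keep only NOTEBOOK objects; DIRECTORY and other types are skipped.
    return [o["path"] for o in payload.get("objects", [])
            if o["object_type"] == "NOTEBOOK"]

# The filtering logic, demonstrated on a sample Workspace API response:
sample = {"objects": [
    {"object_type": "NOTEBOOK", "path": "/jobs/etl/load"},
    {"object_type": "DIRECTORY", "path": "/jobs/etl/archive"},
    {"object_type": "NOTEBOOK", "path": "/jobs/etl/transform"},
]}
paths = notebook_paths(sample)
print(paths)  # ['/jobs/etl/load', '/jobs/etl/transform']

# Inside Databricks you would then loop over the result, e.g.:
#   for p in paths:
#       dbutils.notebook.run(p, timeout_seconds=3600)
```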
- 1810 Views
- 6 replies
- 4 kudos
I have a notebook (nb1) that calls another one (nb2) via the %run command. This returns some visualizations that I want to add to a dashboard of the caller notebook (nb1-db). When I select the visualization drop down, then select Add to dashboard, th...
Latest Reply
Hi @Nicholas Couture, we haven't heard from you since the last response from @Debayan Mukherjee, and I was checking back to see if you have a resolution yet. If you have a solution, please share it with the community, as it can be helpful to othe...
5 More Replies
- 386 Views
- 0 replies
- 0 kudos
I have a quick question about %run <notebook path>. I am using the %run command to import functions from a notebook. It works fine when I run %run once. But when I run two %run commands, I lose the reference from the first %run. I get NameError when ...
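One documented constraint is that %run must be in a cell by itself; another possible cause of a NameError after a second %run is a name collision, where the second notebook rebinds a name the first one defined. The collision case can be simulated locally (the notebook contents below are hypothetical):

```python
# Hypothetical contents of two helper notebooks; both define `load_data`.
nb_first = (
    "def load_data():\n    return 'first'\n"
    "def clean():\n    return 'clean'"
)
nb_second = "def load_data():\n    return 'second'"

ns = {}                 # stands in for the calling notebook's namespace
exec(nb_first, ns)      # like: %run ./nb_first
exec(nb_second, ns)     # like: %run ./nb_second

print(ns["load_data"]())  # 'second' -- the later notebook silently wins
print(ns["clean"]())      # 'clean'  -- non-colliding names survive
```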
- 885 Views
- 0 replies
- 4 kudos
Hi all, I'm trying to run some functions from another notebook (data_process_notebook) in my main notebook, using the %run command. When I run the command %run ../path/to/data_process_notebook, it is able to complete successfully, no path, pe...
by Bency • New Contributor III
- 953 Views
- 2 replies
- 1 kudos
Hi, could someone help me understand how I would be able to get all the parameters in the task (from the widget)? I.e., I want to get the input as parameter 'Start_Date', but it will not always be passed; it could be 'Run_Date' as well ...
Latest Reply
Hi @Bency Mathew, hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thank...
1 More Replies
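In a notebook the usual pattern is to wrap dbutils.widgets.get in a try/except and fall back through the parameter names that might have been passed. The selection logic itself is plain Python; the helper below is hypothetical, and the widget names come from the question:

```python
def first_present(params, candidates):
    """Return (name, value) for the first candidate key present in params."""
    for name in candidates:
        if name in params:
            return name, params[name]
    raise KeyError(f"none of {candidates} was passed")

# Task parameters as a job might deliver them on two different runs:
run_a = {"Start_Date": "2023-01-01"}
run_b = {"Run_Date": "2023-01-02"}

print(first_present(run_a, ["Start_Date", "Run_Date"]))
print(first_present(run_b, ["Start_Date", "Run_Date"]))
```

Inside a notebook you would replace the dict lookup with dbutils.widgets.get(name) in a try/except, since reading a widget that was never created raises an exception.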
- 829 Views
- 0 replies
- 0 kudos
I'm consuming multiple topics from Confluent Kafka and processing each row with business rules using Spark Structured Streaming (.writeStream and .foreach()). While doing that, I call another notebook using %run and call the class via foreach while perform...
- 972 Views
- 2 replies
- 1 kudos
We would like to be able to get the run_id in a job run, and we have the unfortunate restriction that we cannot use dbutils. Is there a way to get it in Python? I know that for the job ID it's possible to retrieve it from the environment variables.
Latest Reply
Hi, please refer to the following thread: https://community.databricks.com/s/question/0D58Y00008pbkj9SAA/how-to-get-the-job-id-and-run-id-and-save-into-a-database. Hope this helps.
1 More Replies
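One way that avoids dbutils entirely is to let the job pass the run id in: Databricks job task parameters support substitution variables such as {{run_id}}, so a task configured with parameters like ["--run_id", "{{run_id}}"] receives the real value on argv at launch time. A minimal sketch of the receiving side (the flag name is an assumption, not a Databricks convention):

```python
import argparse

def parse_run_id(argv):
    """Read the run id that the job task passed as a CLI argument."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--run_id", required=True)
    return parser.parse_args(argv).run_id

# At runtime Databricks substitutes the actual run id for {{run_id}};
# here we simulate the argv the task would receive:
print(parse_run_id(["--run_id", "12345"]))  # '12345'
```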
- 2885 Views
- 4 replies
- 1 kudos
I've tried this, but it doesn't appear to be working: https://community.databricks.com/s/question/0D53f00001GHVX1CAP/unable-to-install-sf-and-rgeos-r-packages-on-the-cluster. When I run the following after that init script, I receive an error. library(r...
Latest Reply
Hey there @Christopher Flach​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear fr...
3 More Replies
- 5380 Views
- 3 replies
- 3 kudos
Hi, since yesterday, without a known reason, some commands that used to run daily are now stuck in a "Running command" state. Commands such as: dataframe.toPandas(), dataframe.show(n=1), dataframe.description(), dataframe.write.format("csv").save(location), ge...
Latest Reply
Hi @Luiz Carneiro, could you split your Spark actions into more cells (paragraphs) and run them one at a time to check where the extra time is being spent? Also, pandas only runs on your driver. Have you tried using the Python or Scala APIs instead? In ca...
2 More Replies
- 3979 Views
- 12 replies
- 5 kudos
Dear connections, I'm unable to run a shell script that schedules a cron job through the init script method on Azure Databricks cluster nodes. Error from the Azure Databricks workspace: "databricks_error_message": "Cluster scoped init script dbfs:/...
Latest Reply
Hello @Sugumar Srinivasan, could you please enable cluster log delivery and inspect the init script logs under dbfs:/cluster-logs/<clusterId>/init_scripts? See https://docs.databricks.com/clusters/configure.html#cluster-log-delivery-1
11 More Replies
- 1630 Views
- 1 reply
- 3 kudos
Is there a way to get the last run date of a job (or jobs)? I am trying to compile a report and want to see if this output exists either in the Databricks jobs CLI output or via the API.
Latest Reply
Sure, using the Databricks Jobs API you can get this information. Use the following API endpoint to get a list of all the jobs and their runs to date, in descending order. You can pass job_id as a parameter to get the runs of a specific job. https://<databri...
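Since runs come back from the Jobs API runs/list endpoint in descending order of start time (in epoch milliseconds), extracting the last run date is a small amount of post-processing. A sketch of that step, run against a trimmed-down sample payload with invented values rather than a live API call:

```python
from datetime import datetime, timezone

def last_run_date(runs_list_response):
    """Most recent start time from a /api/2.1/jobs/runs/list payload.
    Runs are returned newest-first, so the first entry is the latest."""
    runs = runs_list_response.get("runs", [])
    if not runs:
        return None
    ms = runs[0]["start_time"]  # epoch milliseconds
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

# Sample payload shaped like the API response (values invented):
sample = {"runs": [
    {"run_id": 901, "start_time": 1700000000000},
    {"run_id": 900, "start_time": 1699900000000},
]}
print(last_run_date(sample).isoformat())  # 2023-11-14T22:13:20+00:00
```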