Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

aaronpetry
by New Contributor III
  • 2326 Views
  • 2 replies
  • 3 kudos

%run not printing notebook output when using 'Run All' command

I have been using the %run command to run auxiliary notebooks from an "orchestration" notebook. I like using %run over dbutils.notebook.run because of the variable inheritance, troubleshooting ease, and the printing of the output from the auxiliary n...

Latest Reply
Anonymous
Not applicable

Hi @Aaron Petry, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer first; otherwise Bricksters will get back to you soon. Thanks!

1 More Replies
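The trade-off the poster describes can be sketched outside Databricks. %run executes the called notebook in the caller's namespace (so its variables are inherited), while dbutils.notebook.run launches it as a separate, isolated job that only hands back an exit value. A minimal Python simulation, where exec and a string-valued "child notebook" are assumptions standing in for both mechanisms:

```python
# Plain-Python sketch (assumption: not real Databricks APIs) contrasting the
# two invocation styles the post compares.

child_source = "greeting = 'hello from child'"

# %run-style: execute in a namespace shared with the caller,
# so the child's variables become visible afterwards.
run_magic_ns = {}
exec(child_source, run_magic_ns)

# dbutils.notebook.run-style: isolated namespace; the caller only
# sees a returned value (standing in for dbutils.notebook.exit).
def run_isolated(source):
    ns = {}
    exec(source, ns)
    return ns.get("greeting")

print(run_magic_ns["greeting"])    # caller inherits the child's variable
print(run_isolated(child_source))  # caller only gets the exit value back
```

This is why %run also surfaces the child's printed output in the caller: it runs in the same session, not a detached job.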
cmilligan
by Contributor II
  • 4615 Views
  • 2 replies
  • 6 kudos

Resolved! How to go up two folders using relative path in %run?

I'm wanting to store a notebook with functions two folders up from the current notebook. I know that I can start the path with ../ to go up one folder but when I've tried .../ it won't go up two folders. Is there a way to do this?

Latest Reply
VaibB
Contributor

To access a notebook one folder up from the current one, use ../notebook_2; to go two folders up and access a notebook (say, "secret"), use ../../secret

1 More Replies
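The accepted answer can be verified with plain path arithmetic: each ../ climbs exactly one folder, so two levels need ../../, not .../. A small sketch (the workspace paths are made up for illustration):

```python
# Sketch: resolving "../" and "../../" relative to a notebook's folder.
import posixpath
from pathlib import PurePosixPath

current = PurePosixPath("/Workspace/team/project/jobs/notebook_1")

# Each "../" removes one folder from the notebook's parent directory.
one_up = posixpath.normpath(str(current.parent / "../notebook_2"))
two_up = posixpath.normpath(str(current.parent / "../../secret"))

print(one_up)  # /Workspace/team/project/notebook_2
print(two_up)  # /Workspace/team/secret
```

".../" is not a recognized path component, which is why it fails to climb two folders.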
JamesKuo
by New Contributor III
  • 3678 Views
  • 3 replies
  • 6 kudos

Resolved! Where can I find API documentation to dbutils.notebook.entry_point?

dbutils.notebook.help only lists the "run" and "exit" methods. I could only find scattered references to dbutils.notebook.entry_point across the web, but there does not seem to be official Databricks documentation covering its complete API anywhere. Can so...

Latest Reply
Anonymous
Not applicable

Hi @James Kuo, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!

2 More Replies
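When there is no official reference, the usual fallback is runtime introspection with dir() and help(). The same approach works on dbutils.notebook.entry_point inside a notebook; since that object only exists on Databricks, the sketch below introspects a stand-in class (an assumption for illustration):

```python
# Sketch: discovering an undocumented object's public surface at runtime.
# EntryPointStandIn is a made-up placeholder; inside a notebook you would
# pass dbutils.notebook.entry_point to dir() instead.
class EntryPointStandIn:
    def getDbutils(self):
        """Public method, shows up in the listing."""

    def _internal(self):
        """Underscore-prefixed, filtered out below."""

methods = [m for m in dir(EntryPointStandIn) if not m.startswith("_")]
print(methods)  # ['getDbutils']
```

Chaining further dir() calls on each returned object lets you map the whole hierarchy, keeping in mind that undocumented entry points are internal and may change without notice.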
cmilligan
by Contributor II
  • 1904 Views
  • 3 replies
  • 2 kudos

Resolved! Orchestrate run of a folder

I'm needing to run the contents of a folder, which can change over time. Is there a way to set up a notebook that can orchestrate running all notebooks in a folder? My thought was if I could retrieve a list of the notebooks I could create a loop to ru...

Latest Reply
Hubert-Dudek
Esteemed Contributor III

List all notebooks by making an API call and then run them using dbutils.notebook.run: import requests; ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext(); host_name = ctx.tags().get("browserHostName").get(); host_token = ctx.apiToke...

2 More Replies
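The loop the reply describes can be sketched with the Databricks-specific calls factored out, so the orchestration logic runs anywhere. Here list_notebooks would wrap the Workspace API (GET /api/2.0/workspace/list) and run would wrap dbutils.notebook.run; both names, and the fakes below, are assumptions for illustration:

```python
# Sketch: orchestrate every notebook in a folder, with the Databricks
# services injected as plain callables so the loop itself is testable.

def orchestrate(folder, list_notebooks, run, timeout=3600):
    """Run each notebook under `folder` and collect the results."""
    results = {}
    for path in sorted(list_notebooks(folder)):
        results[path] = run(path, timeout)
    return results

# Usage with fakes standing in for the Workspace API and notebook runner:
fake_listing = lambda folder: [folder + "/nb_b", folder + "/nb_a"]
fake_run = lambda path, timeout: "ok:" + path
print(orchestrate("/jobs/daily", fake_listing, fake_run))
```

Because the folder is listed at run time, notebooks added or removed later are picked up automatically, which is the poster's requirement.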
ncouture
by Contributor
  • 2045 Views
  • 6 replies
  • 4 kudos

How to include visualizations returned from %run in the caller notebook's dashboard?

I have a notebook (nb1) that calls another one (nb2) via the %run command. This returns some visualizations that I want to add to a dashboard of the caller notebook (nb1-db). When I select the visualization drop down, then select Add to dashboard, th...

Latest Reply
Kaniz
Community Manager

Hi @Nicholas Couture​, We haven’t heard from you since the last response from @Debayan Mukherjee​ , and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpful to othe...

5 More Replies
VinayEmmadi
by New Contributor
  • 444 Views
  • 0 replies
  • 0 kudos

%run not working as expected

I have a quick question about %run <notebook path>. I am using the %run command to import functions from a notebook. It works fine when I run %run once. But when I run two %run commands, I lose the reference from the first %run. I get NameError when ...

Bency
by New Contributor III
  • 1094 Views
  • 2 replies
  • 1 kudos

How to get the list of parameters passed from widget

Hi, could someone help me understand how I would be able to get all the parameters in the task (from the widget)? i.e. I want to get input as parameter 'Start_Date', but the case is that this will not always be passed. It could be 'Run_Date' as well ...

Latest Reply
Vidula
Honored Contributor

Hi @Bency Mathew, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thank...

1 More Replies
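One way to handle the "either Start_Date or Run_Date may arrive" situation is a small lookup helper that tries each candidate name in order. In a notebook the widget values would come from dbutils.widgets; here they are simulated with a plain dict, and the helper name is an assumption for illustration:

```python
# Sketch: pick the first widget parameter that was actually passed.
# `widgets` is a dict standing in for dbutils.widgets values.

def first_param(widgets, names, default=None):
    """Return (name, value) for the first non-empty parameter in `names`."""
    for name in names:
        if widgets.get(name):
            return name, widgets[name]
    return None, default

# Only Run_Date was supplied this time:
print(first_param({"Run_Date": "2023-01-15"}, ["Start_Date", "Run_Date"]))
# Neither supplied -> fall back to a default:
print(first_param({}, ["Start_Date", "Run_Date"], "1900-01-01"))
```

Returning the matched name alongside the value lets downstream logic branch on which parameter the job actually received.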
Stephen678
by New Contributor II
  • 899 Views
  • 0 replies
  • 0 kudos

Easy way to debug Databricks code: are there breakpoints in Databricks, or an alternative way to achieve this?

I'm consuming multiple topics from confluent kafka and process each row with business rules using Spark structured streaming (.writestream and .foreach()). While doing that i call other notebook using %run and call the class via foreach while perform...

RaymondLC92
by New Contributor II
  • 1163 Views
  • 2 replies
  • 1 kudos

Resolved! How to obtain run_id without using dbutils in python?

We would like to be able to get the run_id in a job run, and we have the unfortunate restriction that we cannot use dbutils. Is there a way to get it in Python? I know for the job ID it's possible to retrieve it from the environment variables.

Latest Reply
artsheiko
Valued Contributor III

Hi, please refer to the following thread: https://community.databricks.com/s/question/0D58Y00008pbkj9SAA/how-to-get-the-job-id-and-run-id-and-save-into-a-database
Hope this helps.

1 More Replies
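A dbutils-free pattern is to have the job pass its own run id in as a task parameter: Databricks job task parameters support variable references such as {{run_id}}, which the service substitutes at run time, and the notebook or script then reads it like any other argument. The --run-id argument name below is an assumption for illustration:

```python
# Sketch: read a run id handed in by the job itself, no dbutils required.
import argparse

def parse_run_id(argv):
    """Parse --run-id from an argv-style list; None if it wasn't passed."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--run-id", default=None)
    return parser.parse_args(argv).run_id

# In the job definition the task parameters would be:
#   ["--run-id", "{{run_id}}"]
# which Databricks expands before the task starts.
print(parse_run_id(["--run-id", "123456"]))  # 123456
```

This keeps the code testable locally, since the run id is just an input rather than something fetched from the notebook context.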
flachboard
by New Contributor
  • 3153 Views
  • 4 replies
  • 1 kudos

How do you install R packages?

I've tried this, but it doesn't appear to be working: https://community.databricks.com/s/question/0D53f00001GHVX1CAP/unable-to-install-sf-and-rgeos-r-packages-on-the-cluster
When I run the following after that init script, I receive an error: library(r...

Latest Reply
Anonymous
Not applicable

Hey there @Christopher Flach, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear fr...

3 More Replies
Carneiro
by New Contributor II
  • 5646 Views
  • 3 replies
  • 3 kudos

Resolved! Stuck in "Running Command ..."

Hi, since yesterday, without a known reason, some commands that used to run daily are now stuck in a "Running command" state. Commands such as: dataframe.toPandas(), dataframe.show(n=1), dataframe.description(), dataframe.write.format("csv").save(location) ge...

Latest Reply
jose_gonzalez
Moderator

Hi @Luiz Carneiro, could you split your Spark actions into more cells (paragraphs) and run them one at a time to check where the extra time is being spent? Also, Pandas only runs on your driver. Have you tried using the Python or Scala APIs instead? In ca...

2 More Replies
Sugumar_Sriniva
by New Contributor III
  • 4576 Views
  • 12 replies
  • 5 kudos

Resolved! Databricks cluster creation fails when running a cron-job scheduling script via the init script method on Azure Databricks.

Dear connections, I'm unable to run a shell script that schedules a cron job through the init script method on Azure Databricks cluster nodes. Error from the Azure Databricks workspace: "databricks_error_message": "Cluster scoped init script dbfs:/...

Latest Reply
User16764241763
Honored Contributor

Hello @Sugumar Srinivasan, could you please enable cluster log delivery and inspect the init script logs under dbfs:/cluster-logs/<clusterId>/init_scripts?
https://docs.databricks.com/clusters/configure.html#cluster-log-delivery-1

11 More Replies
zayeem
by New Contributor
  • 1777 Views
  • 1 replies
  • 3 kudos

Resolved! Databricks - Jobs Last run date

Is there a way to get the last run date of a job (or jobs)? I am trying to compile a report and want to see if this output exists either in the Databricks Jobs CLI output or via the API.

Latest Reply
AmanSehgal
Honored Contributor III

Sure. Using the Databricks Jobs API you can get this information. Use the following API endpoint to get the list of all jobs and their executions to date, in descending order. You can pass job_id as a parameter to get the runs of a specific job: https://<databri...

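Since the runs/list endpoint returns runs in descending start-time order, the latest run date falls out of the first entry for a job. The sketch below works on a hand-written stand-in for the JSON payload rather than a live API call (the response shape is reduced to only the fields used, an assumption for illustration):

```python
# Sketch: extract a job's last run date from a Jobs API runs-list payload.
from datetime import datetime, timezone

def last_run_date(response):
    """Latest run's start time as a UTC datetime, or None if no runs."""
    runs = response.get("runs", [])
    if not runs:
        return None
    # The API reports start_time in epoch milliseconds.
    return datetime.fromtimestamp(runs[0]["start_time"] / 1000, tz=timezone.utc)

# Stand-in for the JSON returned by GET /api/2.1/jobs/runs/list?job_id=...
response = {"runs": [{"run_id": 42, "start_time": 1700000000000}]}
print(last_run_date(response))
```

For the report in the question, looping this over each job_id yields a job-to-last-run-date table.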
wpenfold
by New Contributor II
  • 22250 Views
  • 5 replies
  • 2 kudos
Latest Reply
AmanSehgal
Honored Contributor III

Using the Workspace API you can list all the notebooks for a given user. The API response will tell you if the object under the path is a folder or a notebook. If it's a folder, you can add it to the path and get the notebooks within that folder. Put a...

4 More Replies
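The recursive walk the reply describes can be sketched with the API call injected as a callable: list a path, recurse into DIRECTORY entries, and collect NOTEBOOK entries. Here list_path stands in for GET /api/2.0/workspace/list (the real call needs a workspace host and token via requests; the fake tree below is an assumption for illustration):

```python
# Sketch: recursively collect notebook paths from Workspace API listings.

def collect_notebooks(path, list_path):
    """Depth-first walk over workspace objects, returning notebook paths."""
    notebooks = []
    for obj in list_path(path):
        if obj["object_type"] == "DIRECTORY":
            notebooks += collect_notebooks(obj["path"], list_path)
        elif obj["object_type"] == "NOTEBOOK":
            notebooks.append(obj["path"])
    return notebooks

# Fake workspace tree standing in for the API responses:
fake_tree = {
    "/Users/a": [{"object_type": "DIRECTORY", "path": "/Users/a/etl"},
                 {"object_type": "NOTEBOOK", "path": "/Users/a/main"}],
    "/Users/a/etl": [{"object_type": "NOTEBOOK", "path": "/Users/a/etl/load"}],
}
print(collect_notebooks("/Users/a", fake_tree.get))
```

Swapping fake_tree.get for a function that calls the real endpoint turns this into the per-user notebook inventory the reply suggests.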