Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

ckwan48
by New Contributor III
  • 16787 Views
  • 6 replies
  • 3 kudos

Resolved! How to prevent my cluster from shutting down after inactivity

Currently, I am running a cluster that is set to terminate after 60 minutes of inactivity. However, in one of my notebooks, one of the cells is still running. How can I prevent this from happening if I want my notebook to run overnight without monito...

Latest Reply
AmanSehgal
Honored Contributor III
  • 3 kudos

If a cell is already running (I assume it's a streaming operation), then I don't think the cluster is inactive. The cluster should stay up while a cell is running on it. On the other hand, if you want to keep running your clusters for ...
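If the cluster genuinely must stay up regardless of notebook activity, auto-termination can also be switched off on the cluster itself. A hedged sketch of a Clusters API payload, where setting `autotermination_minutes` to 0 disables automatic termination (the cluster name, Spark version, and node type below are placeholder values, not recommendations):

```json
{
  "cluster_name": "overnight-run",
  "spark_version": "13.3.x-scala2.12",
  "node_type_id": "Standard_DS3_v2",
  "num_workers": 2,
  "autotermination_minutes": 0
}
```

For a one-off overnight run, scheduling the notebook as a job is usually a cleaner option than keeping an interactive cluster alive.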

  • 3 kudos
5 More Replies
dnchankov
by New Contributor II
  • 5112 Views
  • 3 replies
  • 2 kudos

Why can't a notebook I created in a Repo be opened safely?

I've cloned a Repo during "Get Started with Data Engineering on Databricks". Then I'm trying to run another notebook from a cell with a magic %run command. But I get an error that the file can't be opened safely. Here is my code: notebook_aname = "John" print(f"Hello ...

Latest Reply
petermeissner
New Contributor II
  • 2 kudos

It could be that you need to put the %run in a cell all by itself. Suggested here: https://stackoverflow.com/a/72833400/1144966

  • 2 kudos
2 More Replies
lei_armstrong
by New Contributor II
  • 9585 Views
  • 6 replies
  • 6 kudos

Resolved! Executing Notebooks - Run All Cells vs Run All Below

Due to dependencies, if one of our cells errors, we want the notebook to stop executing. We've noticed some odd behaviour when executing notebooks depending on whether "Run all cells in this notebook" is selected from the header versus "Run All Below"....

Latest Reply
sukanya09
New Contributor II
  • 6 kudos

Has this been implemented? I have created a job using a notebook. My notebook has 6 cells, and if the code in the first cell fails, the rest of the cells should not run.
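One way to get that behaviour when the notebook runs as a job task is to make the first cell raise on failure: an uncaught exception fails the task, and the remaining cells never execute. A minimal sketch, where the validation function and its check are hypothetical stand-ins for real logic:

```python
def validate_inputs(row_count: int) -> None:
    """Raise if the upstream data is missing, so the notebook run aborts here."""
    if row_count == 0:
        raise ValueError("No input rows found; aborting the rest of the notebook")

# First cell of the notebook: fail fast before any downstream cells run.
validate_inputs(row_count=42)  # in practice, something like df.count()
```

When run interactively, the same effect requires "Run all" (which stops at the failed cell) rather than running cells individually.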

  • 6 kudos
5 More Replies
KendraVant
by New Contributor II
  • 13890 Views
  • 7 replies
  • 2 kudos

Resolved! How do I clear all output results in a notebook?

I'm building notebooks for tutorial sessions and I want to clear all the output results from the notebook before distributing it to the participants. This functionality exists in Jupyter but I can't find it in Databricks. Any pointers?

Latest Reply
holly
Databricks Employee
  • 2 kudos

Yes! Run > Clear > Clear all cell outputs. Fun fact: this feature was made ~10 years ago when we realised all our customer demos looked very messy and had lots of spoilers in them!

  • 2 kudos
6 More Replies
al_joe
by Contributor
  • 9308 Views
  • 5 replies
  • 3 kudos

Resolved! Split a code cell at cursor position? Add a cell above/below?

In JupyterLab notebooks, in edit mode you can press Ctrl+Shift+Minus to split the current cell into two at the cursor position, and in command mode you can press A or B to add a cell above or below the current cell. Are there equivalent shortcuts...

Latest Reply
DavidKxx
Contributor
  • 3 kudos

What's the status of the ctrl-alt-minus shortcut for splitting a cell?  That keyboard combination does absolutely nothing in my interface (running Databricks via Chrome on GCP).

  • 3 kudos
4 More Replies
hanspetter
by New Contributor III
  • 52126 Views
  • 19 replies
  • 4 kudos

Resolved! Is it possible to get the Job Run ID of a notebook run by dbutils.notebook.run?

When running a notebook using dbutils.notebook.run from a master notebook, a URL to that running notebook is printed, i.e.: Notebook job #223150 Notebook job #223151 Are there any ways to capture that Job Run ID (#223150 or #223151)? We have 50 or ...

Latest Reply
Rodrigo_Mohr
New Contributor II
  • 4 kudos

I know this is an old thread, but sharing what is working well for me in Python now, for retrieving the run_id as well and building the entire link to that job run: job_id = dbutils.notebook.entry_point.getDbutils().notebook().getContext().jobId().get...
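The reply above retrieves the ids from the notebook context; the link itself is then just string formatting. A sketch of that last step with hard-coded placeholder values standing in for the dbutils context calls (the workspace URL, job id, and run id below are made up):

```python
# In a notebook these would come from the task context, e.g.:
# job_id = dbutils.notebook.entry_point.getDbutils().notebook().getContext().jobId().get()
# Here they are placeholders, to show only the link format.
workspace_url = "https://adb-1234567890123456.7.azuredatabricks.net"
job_id = "223150"
run_id = "991"

run_link = f"{workspace_url}/#job/{job_id}/run/{run_id}"
print(run_link)
```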

  • 4 kudos
18 More Replies
MCosta
by New Contributor III
  • 10614 Views
  • 10 replies
  • 19 kudos

Resolved! Debugging!

Hi ML folks, We are using Databricks to train deep learning models. The code, however, has a complex structure of classes. This would work fine in a perfect bug-free world like Alice in Wonderland. Debugging in Databricks is awkward. We ended up do...

Latest Reply
petern
New Contributor II
  • 19 kudos

Has this been solved yet, i.e. is there a mature way to debug code on Databricks? I'm running into the same kind of issue. The variable explorer and pdb can be used, but it's not really the same...

  • 19 kudos
9 More Replies
Erik
by Valued Contributor III
  • 10847 Views
  • 4 replies
  • 3 kudos

Resolved! How to run code formatting on the notebooks

Has anyone found a nice way to run code formatting (like black) on the notebooks **in the workspace**? My current workflow is to commit the file, pull it locally, format, repush and pull. It would be nice if there was some relatively easy way to run blac...

Latest Reply
MartinPlay01
New Contributor II
  • 3 kudos

Hi Erik, I don't know if you are aware of this feature, but currently there is an option to format the code in your Databricks notebooks using the black code style formatter. You just need to have a DBR version equal to or greater than 11.2 ...

  • 3 kudos
3 More Replies
William_Scardua
by Valued Contributor
  • 1402 Views
  • 1 replies
  • 0 kudos

Repos changed my notebook format

Hi guys, I have some notebooks in Repos, but I noticed that Repos changed my notebook format to .py. Because of this, my Azure Data Factory no longer recognizes the notebook (.py). Any idea how to convert that .py back to Databricks format?

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

That is odd. Repos is merely another location (linked to git). You can copy/paste the code inside the .py file into a notebook, or convert it using online tools or Python libraries (like py2ipynb).
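As a rough illustration of what such a conversion does, a .py source can be wrapped into a minimal .ipynb structure with nothing but the standard library. This is a simplified sketch: real tools split cells on markers such as `# COMMAND ----------`, whereas this version puts everything in one cell.

```python
import json

def py_to_minimal_ipynb(py_source: str) -> str:
    """Wrap Python source into a single-cell Jupyter notebook (nbformat 4) JSON string."""
    notebook = {
        "nbformat": 4,
        "nbformat_minor": 5,
        "metadata": {},
        "cells": [
            {
                "cell_type": "code",
                "metadata": {},
                "execution_count": None,
                "outputs": [],
                # nbformat stores source as a list of lines with newlines kept
                "source": py_source.splitlines(keepends=True),
            }
        ],
    }
    return json.dumps(notebook, indent=1)

ipynb_text = py_to_minimal_ipynb('print("hello from a converted notebook")\n')
```

The resulting string can be saved with an .ipynb extension and imported into the workspace.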

  • 0 kudos
pauloquantile
by New Contributor III
  • 4607 Views
  • 8 replies
  • 0 kudos

Resolved! Disable scheduling of notebooks

Hi, we are wondering if it is possible to disable the scheduling of notebooks. A client wants to allow many analysts access to Databricks, but a concern is the possibility of setting schedules (the fastest is every minute!). Is...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Paulo Rijnberg, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking "Select As Best" if it does. Your feedba...

  • 0 kudos
7 More Replies
deep_thought
by Contributor
  • 20486 Views
  • 16 replies
  • 9 kudos

Resolved! Schedule job to run sequentially after another job

Is there a way to schedule a job to run after some other job is complete? E.g. schedule Job A, then upon its completion run Job B.

Latest Reply
claytonseverson
Databricks Employee
  • 9 kudos

Here is the User Guide for Jobs-as-Tasks - https://docs.google.com/document/d/1OJsc-g7IwAJjYooCp7T01Rxyt_xFkMPjmAAGdDGPkY4/edit#heading=h.oudvb5fyfd0n
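With jobs-as-tasks, the "run B after A" ordering is expressed with `depends_on` between tasks in a single job. A hedged sketch of a Jobs API 2.1 payload; the job name and notebook paths are placeholders:

```json
{
  "name": "sequential-pipeline",
  "tasks": [
    {
      "task_key": "job_a",
      "notebook_task": {"notebook_path": "/Repos/team/pipeline/job_a"}
    },
    {
      "task_key": "job_b",
      "depends_on": [{"task_key": "job_a"}],
      "notebook_task": {"notebook_path": "/Repos/team/pipeline/job_b"}
    }
  ]
}
```

Task `job_b` only starts once `job_a` has completed successfully.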

  • 9 kudos
15 More Replies
cblock
by New Contributor III
  • 2146 Views
  • 3 replies
  • 3 kudos

Unable to run jobs with git notebooks

So, in this case our jobs are deployed from our development workspace to our isolated testing workspace via an automated Azure DevOps pipeline. As such, they are created (and thus run as) a service account user. Recently we made the switch to using gi...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Chris Block, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks...

  • 3 kudos
2 More Replies
__Databricks_Su
by Contributor
  • 98847 Views
  • 17 replies
  • 20 kudos
Latest Reply
luis_herrera
Databricks Employee
  • 20 kudos

To pass arguments/variables to a notebook, you can use a JSON file to temporarily store the arguments and then pass it as one argument to the notebook. After passing the JSON file to the notebook, you can parse it with json.loads(). The argument list...
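A sketch of that pattern, with a temp file standing in for the handoff to the child notebook (the argument names and values below are made up for illustration):

```python
import json
import tempfile

# Caller side: serialize the arguments to a JSON file.
args = {"source_table": "raw.events", "target_date": "2023-06-01", "dry_run": False}
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(args, f)
    args_path = f.name

# In Databricks the single path would then be passed along, e.g.:
# dbutils.notebook.run("/path/to/child", 300, {"args_path": args_path})

# Callee side: read the file back and parse it with json.loads().
with open(args_path) as f:
    parsed = json.loads(f.read())
```

This keeps the notebook interface to a single string parameter while still allowing arbitrarily structured arguments.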

  • 20 kudos
16 More Replies
gdev
by New Contributor
  • 6473 Views
  • 6 replies
  • 3 kudos

Resolved! Migrate notebooks, workflows, and other assets

I want to move notebooks, workflows, and data from one user to another in Azure Databricks. We have access to that Databricks workspace. Is it possible? If yes, how do we move them?

Latest Reply
deedstoke
New Contributor II
  • 3 kudos

Hope all is well!

  • 3 kudos
5 More Replies