Data Engineering

Forum Posts

Jfoxyyc
by Valued Contributor
  • 1913 Views
  • 4 replies
  • 0 kudos

Is there a way to catch the cancel button or the interrupt button in a Databricks notebook?

I'm running the oracledb package, and it uses sessions. When you cancel a running query, it doesn't close the session even if you have a try/catch block, because a cancel or interrupt issues a kill command on the process. Is there a method to catch the canc...

Latest Reply
jonathan-dufaul
Valued Contributor
  • 0 kudos

I'm having the same issue and this has been frustrating as heck.

3 More Replies
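A minimal sketch of the usual clean-up pattern for the thread above, assuming a python-oracledb connection (credentials and DSN are placeholders). As the original post notes, a hard cancel that kills the Python process bypasses any Python-level handler, so the finally block is only a best-effort guard for ordinary exceptions and soft interrupts:

```python
import oracledb  # python-oracledb client

# Placeholder credentials/DSN; use your own secrets management in practice.
conn = oracledb.connect(user="app_user", password="***", dsn="db-host:1521/ORCLPDB1")
try:
    with conn.cursor() as cur:
        cur.execute("SELECT 1 FROM dual")
        print(cur.fetchall())
except KeyboardInterrupt:
    # A "soft" interrupt raised inside the Python process can be caught here...
    raise
finally:
    # ...but a cancel that kills the REPL process never reaches this block,
    # which is why the Oracle session can be left open on the server side.
    conn.close()
```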
sgannavaram
by New Contributor III
  • 1480 Views
  • 2 replies
  • 1 kudos

How to connect to IBM MQ from Databricks notebook?

We are trying to connect to IBM MQ and post a message to MQ, which is eventually consumed by a mainframe application. What IBM MQ client .jars / libraries need to be installed on the cluster? If you have any sample code for connectivity, that would be helpful.

Latest Reply
Saleem
New Contributor II
  • 1 kudos

Kindly update if you are able to connect to MQ from Databricks. I am working on the same but have had no luck, as I'm unable to install the pymqi library on the cluster; it shows an error that the MQ library could not be found.

1 More Replies
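For the question above, a minimal pymqi sketch for putting a message on a queue. It assumes the native IBM MQ client runtime (the library pymqi links against) has already been installed on every cluster node, e.g. via an init script; that missing native library is exactly the error reported in the reply. The queue manager, channel, host, and queue names are placeholders.

```python
import pymqi

# Placeholder connection details for a development queue manager.
queue_manager = "QM1"
channel = "DEV.APP.SVRCONN"
conn_info = "mq-host(1414)"   # format: host(port)
queue_name = "DEV.QUEUE.1"

qmgr = pymqi.connect(queue_manager, channel, conn_info)
try:
    queue = pymqi.Queue(qmgr, queue_name)
    queue.put(b"hello from Databricks")  # message to be consumed downstream
    queue.close()
finally:
    qmgr.disconnect()
```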
shiv4050
by New Contributor
  • 1912 Views
  • 4 replies
  • 0 kudos

Execute a Databricks notebook from Python source code.

Hello, I'm trying to execute a Databricks notebook from Python source code but am getting an error. Source code below: from databricks_api import DatabricksAPI # Create a Databricks API client api = DatabricksAPI(host='databrick_host', tok...

Latest Reply
sewl
New Contributor II
  • 0 kudos

The error you are encountering indicates that there is an issue with establishing a connection to the Databricks host specified in your code. Specifically, the error message "getaddrinfo failed" suggests that the hostname or IP address you provided f...

3 More Replies
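As the reply notes, "getaddrinfo failed" means the host value is not a resolvable workspace hostname. A hedged sketch of the same databricks-api client with a real workspace hostname and a one-off notebook run; the hostname, token, cluster ID, and notebook path below are placeholders, and submit_run mirrors the Jobs runs/submit REST payload.

```python
from databricks_api import DatabricksAPI

# Use the real, DNS-resolvable workspace hostname, not a placeholder string.
api = DatabricksAPI(
    host="adb-1234567890123456.7.azuredatabricks.net",  # or <workspace>.cloud.databricks.com
    token="dapiXXXXXXXXXXXXXXXX",
)

# Submit a one-time run of a notebook on an existing cluster.
run = api.jobs.submit_run(
    run_name="triggered-from-python",
    existing_cluster_id="0123-456789-abcdefgh",
    notebook_task={"notebook_path": "/Users/someone@example.com/my_notebook"},
)
print(run["run_id"])
```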
jfarmer
by New Contributor II
  • 3256 Views
  • 3 replies
  • 1 kudos

PermissionError / Operation not Permitted with Files-in-Repos

I've been running a notebook using files-in-repo. Previously this has worked fine. I'm unsure what's changed (I was testing integration with DCS on older runtimes, but don't think I made any persistent changes)--but now it's throwing an error (always...

Latest Reply
_carleto_
New Contributor II
  • 1 kudos

Hi @jfarmer, did you solve this issue? I'm having exactly the same challenge. Thanks!

2 More Replies
dbickshammer
by New Contributor II
  • 3327 Views
  • 3 replies
  • 4 kudos

Resolved! how can I export dashboard to HTML?

Now I can successfully export the notebook view to HTML using the Jobs API (runs export). However, how can I export the dashboard view, generated by the 'Show in dashboard view' tab, to HTML? The tab is in the top-right corner of the cell. I want an ...

Latest Reply
kyxam
New Contributor II
  • 4 kudos

Hi @Kaniz! I am wondering how to export a dashboard tab from a notebook, and I found this old topic. I am not able to find the "views_to_export" parameter that @Hubert-Dudek refers to in the docs. Maybe the docs have been updated and now the parameter is ca...

2 More Replies
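The views_to_export parameter belongs to the Jobs runs/export endpoint rather than the workspace export API, which may be why it is hard to find in the docs. A minimal sketch using plain requests; the workspace URL, token, and run ID are placeholders.

```python
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
token = "dapiXXXXXXXXXXXXXXXX"                                # placeholder access token

resp = requests.get(
    f"{host}/api/2.1/jobs/runs/export",
    headers={"Authorization": f"Bearer {token}"},
    params={"run_id": 123456789, "views_to_export": "DASHBOARDS"},  # CODE | DASHBOARDS | ALL
)
resp.raise_for_status()

# Each exported view comes back as standalone HTML content.
for view in resp.json().get("views", []):
    with open(f"{view['name']}.html", "w") as f:
        f.write(view["content"])
```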
soumyaPattnaik
by New Contributor III
  • 1935 Views
  • 4 replies
  • 6 kudos

How can I customize the Notebook Job # while using dbutils.notebook.run method?

When running multiple notebooks in parallel using dbutils.notebook.run from a parent notebook, a URL to each running notebook is printed, like below: Notebook job #211371132480519. Is there a way I can print the notebook name or some customized string in...

Latest Reply
soumyaPattnaik
New Contributor III
  • 6 kudos

Hi @Debayan, thank you for your reply. However, the answer I am looking for is: how to print/get a more meaningful name for the jobs when running multiple notebooks in parallel using dbutils.notebook.run from a parent notebook. Now in the parent notebook...

3 More Replies
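dbutils.notebook.run itself always prints the generic "Notebook job #..." link, but the parent notebook can log its own, more meaningful line around each call. A hedged sketch of one way to do this when fanning out in parallel; the child notebook paths are placeholders.

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder child notebooks to run in parallel.
notebooks = ["/pipelines/ingest_orders", "/pipelines/ingest_customers"]

def run_child(path: str) -> str:
    # Print a meaningful label alongside the generic "Notebook job #..." output.
    print(f"Starting child notebook: {path}")
    result = dbutils.notebook.run(path, 3600)  # 3600 s timeout
    print(f"Finished child notebook: {path} -> {result}")
    return result

with ThreadPoolExecutor(max_workers=len(notebooks)) as pool:
    results = list(pool.map(run_child, notebooks))
```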
Prank
by New Contributor III
  • 3120 Views
  • 11 replies
  • 8 kudos
Latest Reply
BilalAslamDbrx
Honored Contributor II
  • 8 kudos

@Prank, why do you want the browser hostname?

10 More Replies
mjbobak
by New Contributor III
  • 11721 Views
  • 5 replies
  • 9 kudos

Resolved! How to import a helper module that uses databricks specific modules (dbutils)

I have a main databricks notebook that runs a handful of functions. In this notebook, I import a helper.py file that is in my same repo and when I execute the import everything looks fine. Inside my helper.py there's a function that leverages built-i...

Latest Reply
amitca71
Contributor II
  • 9 kudos

Hi, I'm facing a similar issue when deploying via dbx. I have a helper notebook that works fine when executed via Jobs (without any includes), while when I deploy it via dbx (to the same cluster), the helper notebook results in: dbutils.fs.ls(path) NameEr...

4 More Replies
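A common workaround for the question above is to have helper.py acquire its own dbutils handle instead of relying on the notebook global, since that global only exists in the notebook's namespace. A hedged sketch, assuming Databricks Runtime (pyspark.dbutils.DBUtils is not part of open-source PySpark); list_path is an illustrative helper name.

```python
# helper.py
from pyspark.sql import SparkSession

def get_dbutils(spark: SparkSession):
    """Return a dbutils handle that works in notebooks and in job/dbx deployments."""
    try:
        from pyspark.dbutils import DBUtils  # available on Databricks Runtime
        return DBUtils(spark)
    except ImportError:
        import IPython  # fall back to the interactive notebook namespace
        return IPython.get_ipython().user_ns["dbutils"]

def list_path(path: str):
    spark = SparkSession.builder.getOrCreate()
    dbutils = get_dbutils(spark)
    return dbutils.fs.ls(path)
```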
Chaitanya_Raju
by Honored Contributor
  • 2534 Views
  • 7 replies
  • 0 kudos
Latest Reply
Vartika
Moderator
  • 0 kudos

Hi @Ratna Chaitanya Raju Bandaru, just wanted to check in on whether you were able to resolve your issue. If yes, would you be happy to mark an answer as best? If not, please tell us so we can help you. Thanks!

6 More Replies
User16869510359
by Esteemed Contributor
  • 2927 Views
  • 2 replies
  • 3 kudos

Resolved! Can I install notebook scoped JAR/Maven libraries?

The notebook-scoped libraries are very handy. Is it possible to leverage the same for Maven JARs or application JARs as well?

Latest Reply
Pratik_Ghosh
New Contributor II
  • 3 kudos

Any further update on this topic?

1 More Replies
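Notebook-scoped installs via %pip cover Python packages only; for Maven coordinates or application JARs, the usual fallback is a cluster-level library, which can be attached programmatically through the Libraries API. A hedged sketch; the workspace URL, token, cluster ID, and Maven coordinate are placeholders.

```python
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
token = "dapiXXXXXXXXXXXXXXXX"                                # placeholder access token

resp = requests.post(
    f"{host}/api/2.0/libraries/install",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "cluster_id": "0123-456789-abcdefgh",
        "libraries": [{"maven": {"coordinates": "com.example:example-lib:1.0.0"}}],
    },
)
resp.raise_for_status()
```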
Volkan_Gumuskay
by New Contributor III
  • 2799 Views
  • 6 replies
  • 3 kudos

Resolved! Is there a way to run a single or selected lines in a notebook?

Assume we have a given cell: print('A') print('B') print('C'). I want to run only the line print('B'). Obviously, I can separate the cell into three and run the one I want, but this is time-consuming. This is a feature I use so often (e.g. in PyCharm) and wo...

Latest Reply
Tharun-Kumar
Honored Contributor II
  • 3 kudos

@Volkan_Gumuskay This is also available as an option in the notebook run options.

5 More Replies
chandan_a_v
by Valued Contributor
  • 4736 Views
  • 3 replies
  • 6 kudos

How to restart the Spark session within the notebook without reattaching the notebook?

Hi all, I want to run an ETL pipeline sequentially in my Databricks notebook. If I run it without resetting the Spark session or restarting the cluster, I get a DataFrame key error. I think this might be because of the Spark cache, because if I r...

Latest Reply
g_krilis
New Contributor II
  • 6 kudos

Is there a solution to the above problem? I would also like to restart the SparkSession to free my cluster's resources, but when calling spark.stop() the notebook automatically detaches and the following error occurs: The Spark context has stopped and the dri...

2 More Replies
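Since spark.stop() detaches the notebook on Databricks, a gentler option between pipeline runs is to drop cached data without tearing down the session. A minimal sketch; the DataFrame and view names are placeholders for whatever the pipeline actually caches.

```python
# Drop everything Spark has cached for this session (tables and DataFrames).
spark.catalog.clearCache()

# Or release a specific DataFrame that an earlier stage persisted.
# df.unpersist(blocking=True)

# Temp views left over from previous runs can also be dropped explicitly.
spark.catalog.dropTempView("staging_orders")  # placeholder view name
```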
Paddy_chu
by New Contributor
  • 11050 Views
  • 1 reply
  • 0 kudos

How to restart the kernel on my notebook in databricks?

While installing a Python package in my Databricks notebook, I kept getting a message saying: "Note: you may need to restart the kernel using dbutils.library.restartPython() to use updated packages." I've tried restarting my cluster, also detach ...

(attached screenshot: error message)
Latest Reply
Evan_MCK
Contributor
  • 0 kudos

Just run dbutils.library.restartPython() in the notebook without restarting the cluster or using pip install again. Restarting the cluster erases what you just installed with pip and you are back to square one. Restarting Python after the pi...

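In practice the two steps live in separate cells: the notebook-scoped pip install, then the Python-process restart, which keeps the cluster (and the install) intact while clearing the interpreter. The package name below is a placeholder.

```python
# Cell 1: notebook-scoped install (the %pip magic must be the first line of its cell)
# %pip install some-package==1.2.3

# Cell 2: restart only the Python interpreter; the cluster keeps running and
# the packages installed above remain available after the restart.
dbutils.library.restartPython()
```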
Silán
by New Contributor II
  • 893 Views
  • 3 replies
  • 4 kudos

Resolved! Kept outputs

Hi everyone, I was wondering if perhaps someone could tell me which kinds of outputs are kept in a notebook after the cluster it is attached to is terminated... Actually, I am asking especially because I lost some visualizations that I ...

Latest Reply
Silán
New Contributor II
  • 4 kudos

Great. Thanks a lot.

2 More Replies
elikvar
by New Contributor III
  • 6618 Views
  • 9 replies
  • 9 kudos

Cluster occasionally fails to launch

I have a daily running notebook that occasionally fails with the error: "Run result unavailable: job failed with error message Unexpected failure while waiting for the cluster Some((xxxxxxxxxxxxxxx) )to be readySome(: Cluster xxxxxxxxxxxxxxxx is in un...

Latest Reply
Lebreton
New Contributor II
  • 9 kudos

Hello, any update on this issue? We have the same problem and no logs to investigate (even in DBFS when we activate logging): Unexpected failure while waiting for the cluster (<id of our cluster>) to be ready: Cluster <id of our cluster> is in unexp...

8 More Replies
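While the root cause is being investigated, a common mitigation for transient "cluster is in unexpected state" launch failures is to let the job retry the task automatically. A hedged sketch of the relevant fields in a Jobs 2.1 task definition; the task key, notebook path, and values are illustrative only.

```python
# Fragment of a Jobs API 2.1 task definition (e.g. part of a jobs/create or jobs/reset payload).
task_settings = {
    "task_key": "daily_notebook",                        # placeholder task name
    "notebook_task": {"notebook_path": "/jobs/daily"},   # placeholder notebook
    "max_retries": 3,                     # re-run the task on failure, e.g. cluster never became ready
    "min_retry_interval_millis": 300000,  # wait 5 minutes between attempts
    "retry_on_timeout": False,
}
```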