Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

JohanRex
by New Contributor II
  • 8753 Views
  • 3 replies
  • 5 kudos

Resolved! IllegalArgumentException: requirement failed: Result for RPC Some(e100cace-3836-4461-8902-80b3744fcb6b) lost, please retry your request.

I'm using Databricks Connect to talk to a cluster on Azure. When doing a count on a DataFrame I sometimes get this error message. Once it has appeared, I can't seem to get rid of it even if I restart my dev environment. ----------------...

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @Johan Rex​, we checked with the Databricks Connect team; this issue can happen when the library is too large to upload. Databricks recommends that you use dbx by Databricks Labs for local development instead of Databricks Connect. Databricks plans no ...

2 More Replies
sparkstreaming
by New Contributor III
  • 8381 Views
  • 4 replies
  • 6 kudos

Resolved! Rest API invocation for databricks notebook fails while invoking from ADF pipeline

In the current implementation, a streaming Databricks notebook needs to be started based on the configuration passed. Since the rest of the Databricks notebooks are invoked using ADF, it was decided to use ADF to start these notebooks as well. Since t...

Latest Reply
-werners-
Esteemed Contributor III
  • 6 kudos

@Prasanth KP​, clearly the REST call is invalid. What endpoint do you call? Also, do not forget to authenticate. May I ask why you use the REST API instead of the notebook functionality available in ADF?

3 More Replies
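For readers hitting the same question, a minimal sketch of what a valid REST call to start a notebook job looks like, assuming the Jobs API 2.1 run-now endpoint and a job already defined in the workspace. The workspace URL, token, job ID, and parameter names below are placeholders, not values from the thread.

```python
import json

def run_now_request(workspace_url: str, job_id: int, notebook_params: dict):
    """Build the URL, headers, and JSON body for a Jobs API 2.1 run-now call."""
    url = f"{workspace_url}/api/2.1/jobs/run-now"
    headers = {
        # Authentication is required, as the reply notes; <token> is a placeholder PAT.
        "Authorization": "Bearer <token>",
        "Content-Type": "application/json",
    }
    body = json.dumps({"job_id": job_id, "notebook_params": notebook_params})
    return url, headers, body

# Hypothetical workspace URL, job ID, and notebook parameter:
url, headers, body = run_now_request(
    "https://adb-1234567890.0.azuredatabricks.net", 42, {"config": "streaming-a"}
)
# POST `url` with these headers and body (e.g. via requests.post) to start the run.
print(url)
```

ADF's built-in Databricks Notebook activity avoids hand-rolling this call, which is the point -werners- raises in the reply.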
findinpath
by Contributor
  • 7514 Views
  • 2 replies
  • 3 kudos

Databricks 2.6.25 JDBC driver can't create tables with `GENERATED` columns

I'm using the Databricks JDBC driver recently made available via Maven: https://mvnrepository.com/artifact/com.databricks/databricks-jdbc/2.6.25. While trying to create a table with `GENERATED` columns I receive the following exception: Caused by: java.s...

Latest Reply
findinpath
Contributor
  • 3 kudos

I was under the impression that this had been recognised as a bug and was being handled by Databricks. What do I need to do to report the issue officially as a bug?

1 More Replies
ChristianWuerdi
by New Contributor III
  • 16473 Views
  • 4 replies
  • 5 kudos

Resolved! How can I backup my Databricks instance?

We have a Databricks instance on Azure that has grown somewhat organically, with dozens of users and hundreds of notebooks. How do I conveniently back up this environment so that, in case disaster strikes, the notebooks aren't lost? The data itself is backed by Azure ...

Latest Reply
ChristianWuerdi
New Contributor III
  • 5 kudos

@Kaniz Fatma​ All good, thanks. A combination of the CLI and gradually migrating everything to Git is a viable solution.

3 More Replies
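The CLI half of that solution can be sketched as follows. This assumes the legacy Databricks CLI's `workspace export_dir` command; the paths are illustrative, not taken from the thread.

```python
import subprocess

def export_dir_cmd(workspace_path: str, local_dir: str) -> list:
    """Command line to recursively export workspace notebooks as source files."""
    return ["databricks", "workspace", "export_dir", workspace_path, local_dir, "--overwrite"]

# Export everything under the workspace root to a local backup folder:
cmd = export_dir_cmd("/", "./workspace-backup")
# subprocess.run(cmd, check=True)  # uncomment to run; requires a configured CLI
print(" ".join(cmd))
```

The exported source files can then be committed to Git, which is the gradual-migration path the poster describes.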
StephanieAlba
by Databricks Employee
  • 8741 Views
  • 2 replies
  • 5 kudos

Resolved! How to add a select all option in a Databricks SQL parameter? I would like to use a query-based drop-down list.

So I want to create a 'select all' option in a parameter. The actual parameter has around 200 options because of the size of the database. However, if I want a general summary where you can see all the options, I would have to select them one by one, and that...

Latest Reply
StephanieAlba
Databricks Employee
  • 5 kudos

You could add '--- All Stores ---' to your list. Here is the query I would use to populate the drop-down (from a Stack Overflow answer): SELECT store AS store_name FROM (SELECT DISTINCT store FROM Table UNION ALL SELECT ...

1 More Replies
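The reply's query is cut off in the excerpt; the following is a hedged reconstruction of the visible fragment. The table and column names come from the excerpt, while the sentinel-aware WHERE clause is an assumed pattern, not confirmed by the thread.

```python
# Dropdown query: UNION a sentinel row onto the distinct store values
# (reconstructed from the truncated excerpt; verify against your schema).
dropdown_query = """
SELECT store AS store_name FROM (
    SELECT DISTINCT store FROM Table
    UNION ALL
    SELECT '--- All Stores ---' AS store
) ORDER BY store_name
"""

# Assumed pattern for the dashboard query: the filter becomes a no-op
# when the sentinel value is selected.
sentinel_filter = (
    "WHERE (store = '{{ store_param }}' "
    "OR '{{ store_param }}' = '--- All Stores ---')"
)
print(sentinel_filter)
```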
pantelis_mare
by Contributor III
  • 6625 Views
  • 4 replies
  • 5 kudos

Resolved! Slow imports for concurrent notebooks

Hello all, I have a large number of light notebooks to run, so I am taking the concurrent approach, launching notebook runs with dbutils.notebook.run in parallel. The more I increase parallelism, the more I see the duration of each notebook increasing. I ...

Latest Reply
pantelis_mare
Contributor III
  • 5 kudos

Hello @Kaniz Fatma​, yes it is clear. Following some tests on my side using a ***** notebook that does nothing but import modules and sleep for 15 seconds (so nothing to do with Spark), I figured that even with a 32-core driver, the fatigue point is clo...

3 More Replies
Anonymous
by Not applicable
  • 3108 Views
  • 3 replies
  • 2 kudos

Resolved! Jobs API keeps saying the job is running

I have a library that waits until the job goes into the "TERMINATED" / "SKIPPED" state before continuing. It polls the Jobs API. Unfortunately, I'm experiencing cases where the job is terminated in the GUI but the API still keeps saying "RUNNING". There i...

Latest Reply
Prabakar
Databricks Employee
  • 2 kudos

@Alessio Palma​, could you please tell us which API you are using? Also, please share some sample output and logs to give us more information.

2 More Replies
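A sketch of the polling logic under discussion, assuming the Runs get endpoint (`GET /api/2.1/jobs/runs/get`). The thread's library waits on "TERMINATED" / "SKIPPED", but treating "INTERNAL_ERROR" as terminal too is worth considering, since runs can also end in that state; the response shapes below are abridged examples, not output from the thread.

```python
# Terminal life-cycle states for a job run. The first two are the ones the
# poster's library waits on; INTERNAL_ERROR is also terminal and worth handling.
TERMINAL_STATES = {"TERMINATED", "SKIPPED", "INTERNAL_ERROR"}

def is_finished(runs_get_response: dict) -> bool:
    """True once state.life_cycle_state in a /api/2.1/jobs/runs/get response is terminal."""
    state = runs_get_response.get("state", {})
    return state.get("life_cycle_state") in TERMINAL_STATES

# Abridged example shapes of the runs/get JSON response:
running = {"state": {"life_cycle_state": "RUNNING"}}
done = {"state": {"life_cycle_state": "TERMINATED", "result_state": "SUCCESS"}}
print(is_finished(running), is_finished(done))  # False True
```

In a real poller you would call the endpoint in a loop with a sleep between iterations, stopping when `is_finished` returns True.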
Serhii
by Contributor
  • 3747 Views
  • 2 replies
  • 6 kudos

Resolved! DBFS FileStore html document not showing in the browser

Hello all! I am using the guide https://docs.databricks.com/data/filestore.html to save a folder of static HTML content to the DBFS FileStore directory (as a sub-directory), and I have the "enable DBFS web browsing" setting on, but still I can't view the web p...

Latest Reply
Prabakar
Databricks Employee
  • 6 kudos

@Sergii Ivakhno​ In FileStore you can save files, such as images and libraries, that are accessible within HTML and JavaScript when you call displayHTML. However, when you try to access the link directly, it will download the file to your local desktop.

1 More Replies
my_community2
by New Contributor III
  • 9840 Views
  • 8 replies
  • 1 kudos

Running notebooks on Databricks in Azure blowing up all over since morning of Apr 5 (MST). Was there another poor deployment at Databricks? This reall...

Running notebooks on Databricks in Azure has been blowing up all over since the morning of Apr 5 (MST). Was there another poor deployment at Databricks? This really needs to stop. We are running premium Databricks on Azure and calling notebooks from ADF. 10.2 (inc...

Latest Reply
Prabakar
Databricks Employee
  • 1 kudos

@Maciej G​ try using the below init script to increase the REPL timeout: #!/bin/bash cat > /databricks/common/conf/set_repl_timeout.conf << EOL { databricks.daemon.driver.launchTimeout = 150 } EOL ...

7 More Replies
mo91
by New Contributor III
  • 6297 Views
  • 4 replies
  • 9 kudos

Resolved! Community edition - RestException: PERMISSION_DENIED: Model Registry is not enabled for organization 2183541758974102.

Currently running this command: model_name = "Quality"; model_version = mlflow.register_model(f"runs:/{run_id}/random_forest_model", model_name); # Registering the model takes a few seconds, so add a small delay: time.sleep(15). However I get this error: RestEx...

Latest Reply
Prabakar
Databricks Employee
  • 9 kudos

@Martin Olowe​ There are certain limitations with the Community Edition, and this feature is not available there. To use it you need the commercial version of Databricks, as mentioned by @Hubert Dudek​.

3 More Replies
harish_s
by New Contributor II
  • 6893 Views
  • 3 replies
  • 4 kudos

Resolved! Hi, I get the following error when I enable model serving for spacy model via MLFLOW.

+ echo 'GUNICORN_CMD_ARGS=--timeout 63 --workers 4 'GUNICORN_CMD_ARGS=--timeout 63 --workers 4 + mlflow models serve --no-conda -m /tmp/tmp1a4ltdrk/spacymodelv1 -h unix:/tmp/3.sock -p12022/03/01 08:26:37 INFO mlflow.models.cli: Selected backend for f...

Latest Reply
Prabakar
Databricks Employee
  • 4 kudos

Hi @Harish S​ this error could happen if the backend services are not updated. Are you doing this test in a PVC environment or a standard workspace?

2 More Replies
rsp334
by New Contributor II
  • 2089 Views
  • 0 replies
  • 3 kudos

Databricks quickstart cloudformation error

Has anyone recently encountered the following error in the CloudFormation stack while attempting to create a Databricks Quickstart workspace in AWS? [ERROR] 2022-05-17T16:25:35.920Z 6593c6c0-677c-4918-bcb2-0f5fc9a1c482 Exception: An error occurred (AccessDen...

Doaa_Rashad
by New Contributor III
  • 12725 Views
  • 7 replies
  • 8 kudos

Resolved! import Github repo into Databricks

I am trying to import some data from a public repo on GitHub so that I can use it from my Databricks notebooks. So far I have tried to connect my Databricks account with my GitHub account as described here, without results though, since it seems that GitHub support co...

Latest Reply
User16753725182
Databricks Employee
  • 8 kudos

Hi @Doaa MohamedRashad​, to access this setting, you must be an admin. Please check if you have 'Repos' enabled in the Admin Console --> Workspace settings --> Repos.

6 More Replies
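Once Repos is enabled, one alternative to linking a personal GitHub account is to add the public repo directly through the Repos API (`POST /api/2.0/repos`). This is a hedged sketch; the repo URL and workspace path are placeholders, not values from the thread.

```python
import json

def create_repo_body(git_url: str, workspace_path: str) -> str:
    """JSON body for POST /api/2.0/repos to clone a Git repo into the workspace."""
    return json.dumps({"url": git_url, "provider": "gitHub", "path": workspace_path})

body = create_repo_body(
    "https://github.com/example/public-data",      # placeholder public repo
    "/Repos/someone@example.com/public-data",      # placeholder workspace path
)
print(body)
```

Public repos can be cloned this way without a linked GitHub credential; private repos still require Git credentials to be configured.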
FRG96
by New Contributor III
  • 27625 Views
  • 4 replies
  • 7 kudos

Resolved! How to programmatically get the Spark Job ID of a running Spark Task?

In Spark we can get the Spark Application ID inside the Task programmatically using: SparkEnv.get.blockManager.conf.getAppId, and we can get the Stage ID and Task Attempt ID of the running Task using: TaskContext.get.stageId and TaskContext.get.taskAttemptId...

Latest Reply
FRG96
New Contributor III
  • 7 kudos

Hi @Gaurav Rupnar​ , I have Spark SQL UDFs (implemented as Scala methods) in which I want to get the details of the Spark SQL query that called the UDF, especially a unique query ID, which in SparkSQL is the Spark Job ID. That's why I wanted a way to...

3 More Replies
Bharat105
by New Contributor
  • 1177 Views
  • 0 replies
  • 0 kudos

Unable to complete signup

I am trying to sign up for Databricks for my organization's use. I am unable to complete signup as I am not receiving any email. Please help.

