Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Rnmj
by New Contributor III
  • 10845 Views
  • 5 replies
  • 7 kudos

ConnectException: Connection refused (Connection refused) This is often caused by an OOM error

I am trying to run Python code where a JSON file is flattened to a pipe-separated file. The code works with smaller files, but for huge files of 2.4 GB I get the error below: ConnectException: Connection refused (Connection refused) Error while obtaining a...

Latest Reply
Rnmj
New Contributor III
  • 7 kudos

Hi @Jose Gonzalez, @Werner Stinckens, @Kaniz Fatma, thanks for your responses, much appreciated. The issue was in the code: it was Python/pandas code running on Spark, so only the driver node was being used. I did validate this by increasin...
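As a hedged illustration of that fix, the flattening can be done by Spark itself instead of pandas on the driver, so a 2.4 GB file is parsed across the executors. This is only a sketch: the paths are hypothetical, and it assumes the nesting is one struct level deep.

    # Hypothetical paths -- replace with your own locations.
    src = "/mnt/raw/huge_file.json"
    dst = "/mnt/out/huge_file_pipe"

    # Spark parses the JSON on the executors, not on the driver.
    df = spark.read.json(src)

    # Expand one level of nested structs into top-level columns.
    cols = []
    for f in df.schema.fields:
        if f.dataType.typeName() == "struct":
            cols.append(f.name + ".*")  # "struct_col.*" selects its subfields
        else:
            cols.append(f.name)

    # Write the flattened rows as pipe-separated text, still fully distributed.
    df.select(*cols).write.option("sep", "|").option("header", True).csv(dst)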

4 More Replies
Braxx
by Contributor II
  • 1701 Views
  • 1 reply
  • 3 kudos

Retry API request if it fails

I have a simple API request to query a table and retrieve data, which is then loaded into a dataframe. It may happen that it fails for different reasons. How can I retry it, let's say, 5 times when any kind of error takes place? Here is the API request: d...

Latest Reply
Manoj
Contributor II
  • 3 kudos

@Bartosz Wachocki, use a timeout, a retry interval, recursion, and exception handling. Pseudocode below:

    from time import sleep

    timeout = 300

    def exec_query(query, timeout):
        try:
            df = spark.createDataFrame(sf.bulk.MyTable.query(query))
        except Exception:
            if timeout > 0:
                sleep(60)
                exec_query(query, timeout - 60)
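For the five retries the question asks about, a bounded loop is an alternative to the recursive pseudocode above. A minimal sketch; with_retries is a hypothetical helper, and the commented usage assumes the same sf.bulk query as in the reply.

    import time

    def with_retries(fn, attempts=5, wait_seconds=60):
        # Call fn(); on any exception, wait and retry up to `attempts` times.
        for attempt in range(1, attempts + 1):
            try:
                return fn()
            except Exception as err:
                if attempt == attempts:
                    raise  # out of retries: surface the last error
                print(f"Attempt {attempt} failed ({err}); retrying in {wait_seconds}s")
                time.sleep(wait_seconds)

    # Hypothetical usage, mirroring the query above:
    # df = with_retries(lambda: spark.createDataFrame(sf.bulk.MyTable.query(query)))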

adb-rm
by New Contributor II
  • 1464 Views
  • 2 replies
  • 2 kudos

Resolved! Mail configuration in an Azure Databricks PySpark notebook

Hi all, I am new to Azure Databricks and I am using PySpark. We need to configure mail alerts for when a notebook fails or succeeds. Please can someone help me with mail configuration in Azure Databricks? Thanks

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

The easiest way to schedule notebooks in Azure is to use Data Factory. In Data Factory you can schedule the notebooks and define the alerts you want to send. The other option is the one Hubert mentioned.
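If the alert has to come from the notebook itself rather than from Data Factory, one hedged option is plain SMTP at the end of the run. This is a sketch only: the SMTP host, addresses, secret scope, and run_etl() are all hypothetical stand-ins.

    import smtplib
    from email.message import EmailMessage

    def send_alert(subject, body):
        msg = EmailMessage()
        msg["Subject"] = subject
        msg["From"] = "alerts@example.com"  # hypothetical sender
        msg["To"] = "team@example.com"      # hypothetical recipient
        msg.set_content(body)
        with smtplib.SMTP("smtp.example.com", 587) as server:  # hypothetical host
            server.starttls()
            # Password kept in a hypothetical Databricks secret scope.
            server.login("alerts@example.com", dbutils.secrets.get("mail", "smtp-password"))
            server.send_message(msg)

    try:
        run_etl()  # hypothetical stand-in for the notebook's main logic
        send_alert("Notebook succeeded", "All good.")
    except Exception as err:
        send_alert("Notebook failed", str(err))
        raise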

1 More Replies
dimoobraznii
by New Contributor III
  • 5754 Views
  • 3 replies
  • 9 kudos

'databricks-connect' is not recognized as an internal or external command, operable program or batch file on Windows

Hello, I've installed databricks-connect on Windows 10:

    C:\Users\danoshin>pip install -U "databricks-connect==9.1.*"
    Collecting databricks-connect==9.1.*
      Downloading databricks-connect-9.1.2.tar.gz (254.6 MB) |████████████████████████████████| 2...

Latest Reply
-werners-
Esteemed Contributor III
  • 9 kudos

@Dmitry Anoshin, that seems messed up. The best you can do is to remove databricks-connect and also uninstall any PySpark installation, and then follow the installation guide. It should work after following the procedure. I use a Linux VM for this p...

2 More Replies
Greg_Galloway
by New Contributor III
  • 5149 Views
  • 5 replies
  • 3 kudos

Resolved! Use of private endpoints for storage in workspace with EnableNoPublicIP=Yes and VnetInjection=No

We know that Databricks with VNET injection (our own VNET) allows us to connect to ADLS Gen2 over private endpoints. This is what we typically do. We have a customer who created Databricks with EnableNoPublicIP=Yes (secure cluster connectivity) and Vn...

Latest Reply
User16871418122
Contributor III
  • 3 kudos

A managed VNET is locked down and allows very limited configuration tuning; even VNET peering is facilitated by Databricks and needs to be done from the Databricks UI. If they want more control over the VNET, they need to migrate to a VNET-injected workspace.

4 More Replies
Manoj
by Contributor II
  • 4120 Views
  • 9 replies
  • 8 kudos

Resolved! Is there a way to persist the delta cache even after the cluster restart?

Hi team, we are planning to connect Power BI directly to Databricks; however, data fetching using DirectQuery isn't giving great performance, even though we are using Z-ORDER BY, partitioning, etc. We decided to use the Delta cache, but the cache tables area ...

Latest Reply
Manoj
Contributor II
  • 8 kudos

@Hubert Dudek, I've got good news. I agree with @Werner Stinckens: the SQL endpoint is super fast. I tested it with 143 million records using DirectQuery from Power BI, and the result returned in 10-12 seconds. Don't even try doing incremental in Power BI, ...
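On the original question: the Delta cache lives on the workers' local disks, so it does not survive a cluster restart; it can only be re-warmed. A hedged sketch of a warm-up cell that could run as a job right after the cluster starts, using Databricks' CACHE SELECT; sales_fact is a hypothetical table name.

    # Re-warm the Delta cache after a restart for the data Power BI hits most.
    spark.sql("CACHE SELECT * FROM sales_fact")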

8 More Replies
pantelis_mare
by Contributor III
  • 12206 Views
  • 7 replies
  • 5 kudos

Resolved! [SOLVED] maxPartitionBytes ignored?

Hello all! I'm running a simple read noop query where I read a specific partition of a Delta table that looks like this: With the default configuration, I read the data in 12 partitions, which makes sense, as the files that are more than 128 MB are split...

Latest Reply
ashish1
New Contributor III
  • 5 kudos

AQE doesn't affect read-time partitioning, only shuffle-time partitioning. It would be better to run OPTIMIZE on the Delta table, which will compact the files to approximately 1 GB each; that provides better read-time performance.
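As a hedged illustration of the two knobs discussed in this thread, the sketch below raises maxPartitionBytes before the read and then compacts the table with OPTIMIZE; the Delta path is hypothetical.

    # Larger read partitions: raise maxPartitionBytes (default 128 MB) before reading.
    spark.conf.set("spark.sql.files.maxPartitionBytes", str(512 * 1024 * 1024))
    df = spark.read.format("delta").load("/mnt/delta/events")  # hypothetical path

    # Compact small files into ~1 GB files, as suggested above.
    spark.sql("OPTIMIZE delta.`/mnt/delta/events`")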

6 More Replies
Nickels
by New Contributor II
  • 1437 Views
  • 4 replies
  • 1 kudos

Resolved! Replying to inline runtime prompts

I feel like the answer to this question should be simple, but nonetheless I'm struggling. I run Python code that prompts me with the following warning: On my local machine, I can accept this through my terminal and my machine does not run out of memo...

Latest Reply
jose_gonzalez
Moderator
  • 1 kudos

Hi @Nickels Köhling, in Databricks you will only be able to see the output in the driver logs. If you go to your driver logs, you will see three windows displaying the output of "stdout", "stderr" and "log4j". If in your code you do ...
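To illustrate where output lands, a small sketch: print() shows up in the "stdout" window, while messages sent through the JVM's log4j land in the "log4j" window. Reaching log4j through the SparkContext's _jvm handle is a common pattern but relies on an internal attribute; the logger name is hypothetical.

    # Appears in the "stdout" window of the driver logs.
    print("hello from stdout")

    # Appears in the "log4j" window of the driver logs.
    log4j = spark.sparkContext._jvm.org.apache.log4j
    logger = log4j.LogManager.getLogger("my-notebook")  # hypothetical logger name
    logger.info("hello from log4j")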

3 More Replies
yatharth29
by New Contributor II
  • 2365 Views
  • 1 reply
  • 2 kudos
Latest Reply
Sajesh
New Contributor III
  • 2 kudos

Hi @Yatharth Kaushik, you can get the data into a table using the cluster events API: https://docs.databricks.com/dev-tools/api/latest/clusters.html#events
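A hedged sketch of pulling those events and landing them in a table; the workspace URL, secret scope, cluster ID, and table name are placeholders.

    import requests

    host = "https://<workspace>.azuredatabricks.net"  # placeholder URL
    token = dbutils.secrets.get("api", "pat")         # hypothetical secret scope/key
    cluster_id = "0123-456789-abcde000"               # placeholder cluster ID

    resp = requests.post(
        f"{host}/api/2.0/clusters/events",
        headers={"Authorization": f"Bearer {token}"},
        json={"cluster_id": cluster_id, "limit": 50},
    )
    resp.raise_for_status()
    events = resp.json().get("events", [])

    # Land the events in a (hypothetical) table for querying.
    spark.createDataFrame(
        [(e["timestamp"], e["type"]) for e in events],
        "timestamp LONG, type STRING",
    ).write.mode("append").saveAsTable("cluster_events")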

Jreco
by Contributor
  • 3051 Views
  • 3 replies
  • 5 kudos

Resolved! Reference py file from a notebook

Hi all, I'm trying to reference a .py file from a notebook, following this documentation: Files in Repos. I downloaded and added the files to my repo, and when I try to run the notebook, the module is not recognized. Any idea why this is happening? Thanks ...

Latest Reply
-werners-
Esteemed Contributor III
  • 5 kudos

In this topic you can find some more info: https://community.databricks.com/s/question/0D53f00001Pp5EhCAJ The docs are not that clear.
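In case it helps others hitting this: on runtimes where the repo root is not picked up automatically, appending it to sys.path before the import is a common workaround. The repo path and module name below are hypothetical.

    import sys

    # Hypothetical repo path -- adjust the user and repo names.
    sys.path.append("/Workspace/Repos/me@example.com/my-repo")

    from utils import helper_function  # utils.py at the repo root (hypothetical)
    helper_function()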

2 More Replies
Mec_Mec
by New Contributor II
  • 3447 Views
  • 6 replies
  • 4 kudos

Resolved! Copy a script from the current subscription to a new subscription

I would like to check if there is a process to copy or migrate a script/code from the Azure Databricks notebooks in the current subscription to a new Databricks subscription (a new notebook).

Latest Reply
Mec_Mec
New Contributor II
  • 4 kudos

How can I quickly move Databricks notebooks from one account to another?
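One hedged way to do it is the Workspace API: export each notebook's source from the old workspace and import it into the new one. The URLs, tokens, and notebook path below are placeholders.

    import requests

    src_host, src_token = "https://old-workspace.azuredatabricks.net", "OLD_TOKEN"
    dst_host, dst_token = "https://new-workspace.azuredatabricks.net", "NEW_TOKEN"
    path = "/Users/me@example.com/my_notebook"  # hypothetical notebook path

    # Export the notebook source (base64-encoded) from the old workspace...
    r = requests.get(f"{src_host}/api/2.0/workspace/export",
                     headers={"Authorization": f"Bearer {src_token}"},
                     params={"path": path, "format": "SOURCE"})
    r.raise_for_status()

    # ...and import it, still base64-encoded, into the new workspace.
    requests.post(f"{dst_host}/api/2.0/workspace/import",
                  headers={"Authorization": f"Bearer {dst_token}"},
                  json={"path": path, "format": "SOURCE", "language": "PYTHON",
                        "content": r.json()["content"], "overwrite": True}
                  ).raise_for_status()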

5 More Replies
Håkon_Åmdal
by New Contributor III
  • 1729 Views
  • 1 reply
  • 1 kudos

Resolved! Incorrect length for `string` returned by the Databricks ODBC driver

Dear Databricks and community, I have been struggling with a bug related to using golang and the Databricks ODBC driver. It turns out that `SQLDescribeColW` consistently returns 256 as the length for `string` columns. However, in Spark, strings might b...

Latest Reply
User16829050420
New Contributor III
  • 1 kudos

Thanks for posting this issue, @Håkon Åmdal. We should be able to reproduce it and report it to the Magnitude team subsequently.

RasmusOlesen
by New Contributor III
  • 3517 Views
  • 5 replies
  • 1 kudos

Resolved! ciso8601 library stopped installing out of the blue on DB clusters

We have multiple DB clusters (6.4 Extended Support) that have not changed in terms of libs installed, nodes, etc. Suddenly, from one day to the next, after a cluster restart on August 7th, they stopped installing the ciso8601 lib as they usually would. Anyb...

Latest Reply
RasmusOlesen
New Contributor III
  • 1 kudos

Just to close this old question: we solved this by switching to a PEP 517-free pip install, using a global init script:

    /databricks/python/bin/pip install ciso8601 --disable-pip-version-check --no-use-pep517

Now it works for us.

4 More Replies
ssm3819
by New Contributor III
  • 3616 Views
  • 3 replies
  • 3 kudos

Please let me know how I can install PyAudio using a Databricks notebook

Hi, I am trying to install the PyAudio package, but I am getting the following error:

    Collecting pyaudio
      Using cached PyAudio-0.2.11.tar.gz (37 kB)
    Building wheels for collected packages: pyaudio
      Building wheel for pyaudio (setup.py) ... error
    ERROR: Co...

Latest Reply
-werners-
Esteemed Contributor III
  • 3 kudos

Looks like a missing dependency on the server (Linux): portaudio. This should be installed: https://stackoverflow.com/questions/48690984/portaudio-h-no-such-file-or-directory

2 More Replies