Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Out of Memory/Connection Lost When Writing to External SQL Server from Databricks Using JDBC Connection

Megan05
New Contributor III

I am writing a large amount of data from Databricks to an external SQL Server using a JDBC connection. I keep getting timeout/connection-lost errors, but digging deeper it appears to be a memory problem. I am wondering what cluster configuration I may need and where it would be best to cache my data. The input is about 60 GB of data, reduced to 60 million rows. The process writes about 1 million rows to the external database and then crashes.

I have tried different cluster configurations (memory optimized, compute optimized, etc.). I have also tried different garbage collection settings, as the garbage collection metric is dark red during the process.

1 ACCEPTED SOLUTION

Accepted Solutions

Hubert-Dudek
Esteemed Contributor III

Please increase the number of DataFrame partitions using

coalesce(<N>) or repartition(<N>). In most cases, that should solve the issue, as the data will then be written in chunks, one per partition.

In addition, these JDBC connection properties can help (see JDBC To Other Databases - Spark 3.3.0 Documentation (apache.org)):

numPartitions

batchsize

isolationLevel
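Putting the advice above together, a minimal PySpark sketch of a chunked JDBC write might look like the following. The host, database, table, credentials, and the partition/batch numbers are all placeholders to adapt to your own setup, not values from this thread:

```python
# Sketch: write a large DataFrame to SQL Server over JDBC in chunks.
# repartition(<N>) controls how many chunks (tasks) write in parallel;
# batchsize controls rows per JDBC batch insert within each task.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("jdbc-chunked-write").getOrCreate()

df = spark.range(0, 1_000_000)  # stand-in for the real 60M-row DataFrame

(
    df.repartition(64)  # more, smaller partitions -> smaller chunks per task
      .write
      .format("jdbc")
      .option("url", "jdbc:sqlserver://<host>:1433;databaseName=<db>")
      .option("dbtable", "dbo.target_table")       # placeholder table
      .option("user", "<user>")
      .option("password", "<password>")
      .option("batchsize", 10000)                  # rows per batch insert
      .option("isolationLevel", "READ_COMMITTED")
      .mode("append")
      .save()
)
```

Note that coalesce(<N>) only reduces the partition count without a shuffle, while repartition(<N>) can increase it; for breaking a large write into more chunks, repartition is usually the one you want.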

View solution in original post

5 REPLIES

Hubert-Dudek
Esteemed Contributor III

Please increase the number of DataFrame partitions using

coalesce(<N>) or repartition(<N>). In most cases, that should solve the issue, as the data will then be written in chunks, one per partition.

In addition, these JDBC connection properties can help (see JDBC To Other Databases - Spark 3.3.0 Documentation (apache.org)):

numPartitions

batchsize

isolationLevel

Megan05
New Contributor III

Thanks for your response, Hubert! That seemed to fix the timeout issue.

Hubert-Dudek
Esteemed Contributor III

Great to hear. If it is possible, please select my answer as the best one.

Excuse me Megan05, what parameters did you use?

hotrabattecom
New Contributor II

Thanks for the answer. I also ran into this problem.

Hotrabatt
