Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Long execution time writing a Spark DataFrame to SQL Server tables

Sha_1890
New Contributor III

I have 8 GB of XML data loaded into different DataFrames. Two of these DataFrames, with 24 lakh (2.4 million) and 82 lakh (8.2 million) rows respectively, need to be written to two SQL Server tables, and those writes are taking about 2 hours and 5 hours.

I am using the below cluster configuration

[Cluster configuration screenshot]

And the Python code:

df.write.format("jdbc") \
    .option("url", jdbcUrl) \
    .partitionBy("C_Code") \
    .mode("append") \
    .option("dbtable", "staging.tablename") \
    .option("user", jdbcUsername) \
    .option("password", jdbcPassword) \
    .save()
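
For context, the standard Spark JDBC writer also exposes batching and parallelism knobs. Below is a minimal sketch of the same append with those options, assuming the same df, jdbcUrl, jdbcUsername and jdbcPassword as above; the partition count of 32 and the batch size of 10000 are illustrative placeholders, not tuned values. Note that partitionBy() targets file-based sinks and, as far as I can tell, does not drive JDBC write parallelism, whereas repartition() on the DataFrame does.

# Sketch only: same append into staging.tablename, with explicit write
# parallelism (repartition / numPartitions) and a larger JDBC batch size.
# The 32 partitions and batchsize of 10000 are assumptions to be tuned.
(
    df.repartition(32)                      # number of parallel tasks (JDBC connections) writing
      .write.format("jdbc")
      .option("url", jdbcUrl)
      .option("dbtable", "staging.tablename")
      .option("user", jdbcUsername)
      .option("password", jdbcPassword)
      .option("batchsize", 10000)           # rows per batched INSERT (JDBC writer default is 1000)
      .option("numPartitions", 32)          # upper bound on concurrent connections to SQL Server
      .mode("append")
      .save()
)

Whether this helps depends heavily on how many concurrent inserts the SQL Server instance itself can absorb.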

Please suggest any other way to lower the execution time.

0 REPLIES
