Data Engineering

I have to optimise our initial load; to do that, I want to perform batch inserts while loading the data.

Databricks_Work
New Contributor II


4 REPLIES

Lakshay
Esteemed Contributor

If you want to optimize your job, you should first identify which part of it is taking the time. Take a look at the Spark UI and find the operations that are slow.

While writing data from a Spark DataFrame to a PostgreSQL table, the write is slow for large tables that contain approximately 10 million records.
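For a write like this, Spark's JDBC data source exposes a `batchsize` option (rows per JDBC batch, default 1000), and the PostgreSQL JDBC driver can additionally collapse each batch into multi-row INSERTs via its `reWriteBatchedInserts` connection parameter. A minimal sketch of the relevant options, with placeholder connection details (host, database, table, and credentials are assumptions, not the poster's real values):

```python
# Sketch: options for a batched JDBC write from Spark to PostgreSQL.
# All connection details are placeholders.

jdbc_options = {
    # reWriteBatchedInserts lets the PostgreSQL driver rewrite a batch
    # into multi-row INSERT statements, cutting round trips.
    "url": "jdbc:postgresql://host:5432/mydb?reWriteBatchedInserts=true",
    "dbtable": "target_table",
    "user": "username",
    "password": "password",
    "batchsize": "10000",   # rows per JDBC batch (Spark's default is 1000)
    "numPartitions": "8",   # max parallel connections writing concurrently
}

# On a live cluster with a reachable database this would be applied as:
# (df.write.format("jdbc").options(**jdbc_options).mode("append").save())
```

Raising `batchsize` well above the default and enabling `reWriteBatchedInserts` together usually give the largest speedup for bulk loads; `numPartitions` then controls how many of those batched streams run in parallel.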

Lakshay
Esteemed Contributor

How many tasks do you see in the Spark UI for the write operation?
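The task count matters because the JDBC write stage runs one task (and one database connection) per DataFrame partition, so the partition count caps write parallelism; in Spark you would check it with `df.rdd.getNumPartitions()` and adjust it with `df.repartition(n)`. A toy sketch of that round-robin redistribution in plain Python, standing in for what `repartition` does across the cluster:

```python
# Toy model: round-robin rows into n buckets, the way df.repartition(n)
# redistributes rows so the write stage gets n parallel tasks.

def repartition(rows, num_partitions):
    """Distribute rows round-robin into num_partitions buckets."""
    buckets = [[] for _ in range(num_partitions)]
    for i, row in enumerate(rows):
        buckets[i % num_partitions].append(row)
    return buckets

# 100 rows spread over 8 "partitions" -> 8 concurrent JDBC write tasks
parts = repartition(list(range(100)), 8)
```

If the Spark UI shows only one or two tasks in the write stage, repartitioning the DataFrame before the write is usually the first thing to try.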

Kaniz
Community Manager

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance! 
 
