Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

I need to optimise our initial load; to do that, I want to perform batch inserts while loading the data.
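A minimal sketch of what batched inserts buy you, using the stdlib sqlite3 module as a stand-in for PostgreSQL (table name, row count, and batch size are all hypothetical): executemany sends one round trip per batch instead of one statement per row, which is the core idea behind batching the initial load.

```python
import sqlite3

# In-memory database as a stand-in for the real PostgreSQL target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER, payload TEXT)")

# Hypothetical source data for the initial load.
rows = [(i, f"record-{i}") for i in range(10_000)]
batch_size = 1_000  # hypothetical; tune against the target database

# Insert in batches: one executemany call (one round trip) per batch.
for start in range(0, len(rows), batch_size):
    batch = rows[start:start + batch_size]
    conn.executemany("INSERT INTO staging VALUES (?, ?)", batch)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
```

With a real PostgreSQL target the same pattern applies, except the batching is usually delegated to the JDBC writer rather than hand-rolled.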

Databricks_Work
New Contributor II


4 REPLIES

Lakshay
Esteemed Contributor

If you want to optimize your job, you should first identify which part of the job is taking the time. Take a look at the Spark UI and find the operations that are slow.

While writing data from a Spark DataFrame to a PostgreSQL table, it is slow for large tables, which contain approximately 10 million records.
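For a Spark-to-PostgreSQL JDBC write, the usual tuning knobs are the writer's `batchsize` option (rows per JDBC batch, default 1000) and the pgjdbc connection property `reWriteBatchedInserts`, which rewrites batches into multi-row INSERT statements. A sketch of such a configuration (URL, table name, and values are hypothetical; the actual write is shown commented out since it needs a live cluster and database):

```python
# Hypothetical option set for a Spark DataFrame -> PostgreSQL bulk load.
jdbc_options = {
    "url": "jdbc:postgresql://dbhost:5432/mydb",  # hypothetical connection URL
    "dbtable": "public.target_table",             # hypothetical target table
    "user": "loader",                             # hypothetical credentials
    "password": "secret",
    "batchsize": "10000",             # rows per JDBC batch (Spark default: 1000)
    "reWriteBatchedInserts": "true",  # pgjdbc: rewrite batches as multi-row INSERTs
}

# Requires a live SparkSession and reachable database, so shown as a sketch:
# df.repartition(16).write.format("jdbc").options(**jdbc_options).mode("append").save()
```

Repartitioning before the write also matters: each partition becomes one write task with its own JDBC connection, so a single-partition DataFrame loads serially no matter how large the batches are.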

Lakshay
Esteemed Contributor

How many tasks do you see in the Spark UI for the write operation?
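The number of write tasks equals the number of DataFrame partitions, so a rough back-of-the-envelope sizing can suggest a partition count for the repartition before the write (all numbers here are hypothetical, assuming the ~10 million rows mentioned above):

```python
# Rough sizing sketch: how many partitions (= parallel write tasks) to use.
total_rows = 10_000_000        # approximate table size from the thread
batch_size = 10_000            # hypothetical rows per JDBC batch
batches_per_task = 60          # hypothetical target batches per task

num_partitions = max(1, total_rows // (batch_size * batches_per_task))
# Repartition the DataFrame to num_partitions before the JDBC write,
# keeping it below what the PostgreSQL server can accept as concurrent
# connections.
```

If the Spark UI shows only one task for the write, the DataFrame has a single partition and the whole load runs over one connection.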

Kaniz_Fatma
Community Manager

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance! 
 
