How to reduce the time taken to load data into an Azure Synapse table?
02-14-2023 04:19 AM
Hi All,
I just wanted to know if there is any option to reduce the time taken to load a PySpark DataFrame into an Azure Synapse table using Databricks.
For example: I have a PySpark DataFrame with around 40k records, and loading it into the Azure Synapse table from Databricks takes over 1 hour and 10 minutes. I am using save mode 'overwrite' as per the requirements.
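Roughly, the write looks like this (a minimal sketch; the format and connection details below are placeholders, assuming a plain JDBC write):

```python
# Minimal sketch of the current write; server, database and credentials are placeholders,
# and a plain JDBC write is assumed here.
(df.write
    .format("jdbc")
    .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>")
    .option("dbtable", "dbo.target_table")
    .option("user", "<user>")
    .option("password", "<password>")
    .mode("overwrite")  # overwrite is required for this load
    .save())
```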
Please let me know if there is any possible solution to reduce this time.
Thanks,
Tinendra
- Labels: Azure, Data, Pyspark Dataframe, Time
02-14-2023 04:22 AM
Hi @Tinendra Kumar ,
You can scale up the Synapse SQL pool (increase its DWUs), and if possible use append mode instead of overwrite when saving; that will help reduce the time.
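For example, with whichever connector you already use, switching the mode is a one-line change (a sketch; `existing_connection_options` is a hypothetical dict holding your current options):

```python
# Sketch: keep the existing format and connection options, change only the save mode.
# existing_connection_options is a hypothetical dict of the options you already pass.
(df.write
    .format("jdbc")
    .options(**existing_connection_options)
    .mode("append")  # append instead of overwrite avoids rebuilding the table each run
    .save())
```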
02-14-2023 04:26 AM
Hi @Ajay Pandey
I don't have any control over the Azure side. Could you please tell me if there is any way/option to do this on the Spark/Databricks side?
02-14-2023 05:25 AM
Hi @Tinendra Kumar ,
There is no option to check your permissions from Databricks.
02-14-2023 05:04 AM
Have you checked this:
https://learn.microsoft.com/en-us/azure/databricks/archive/azure/synapse-polybase
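The connector write from that page looks roughly like this (a sketch; the server, credentials, table name and staging path below are placeholders you would adapt). The key point is the `tempDir`: the connector stages the data in ADLS and bulk-loads it via PolyBase/COPY instead of inserting row by row over JDBC.

```python
# Sketch of a write through the Databricks Azure Synapse connector (placeholders throughout).
(df.write
    .format("com.databricks.spark.sqldw")
    .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;"
                   "database=<db>;user=<user>;password=<password>")
    .option("forwardSparkAzureStorageCredentials", "true")
    .option("dbTable", "dbo.target_table")
    # Staging location in ADLS Gen2; the connector bulk-loads from here via PolyBase/COPY.
    .option("tempDir", "abfss://<container>@<storage-account>.dfs.core.windows.net/tmp")
    .mode("overwrite")
    .save())
```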
To be honest, I do not use Databricks to load data into Synapse. I write the data as Parquet/Delta Lake on our data lake and use ADF to copy it into Synapse if necessary; this goes pretty fast (see the sketch below).
Another option is to use Synapse Serverless or external tables on the Parquet files themselves.
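For the first approach, the Databricks side is just a write to the lake (a sketch; the abfss path is a placeholder), and ADF or a Synapse external table picks it up from there:

```python
# Sketch: write Delta (or Parquet) to the data lake; ADF copy activities or
# Synapse serverless/external tables read from this path. The path is a placeholder.
(df.write
    .format("delta")  # or "parquet"
    .mode("overwrite")
    .save("abfss://<container>@<storage-account>.dfs.core.windows.net/curated/target_table"))
```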
02-16-2023 09:39 PM
Hi @Tinendra Kumar
Hope all is well! Just wanted to check in to see if you were able to resolve your issue; if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!

