04-07-2022 01:43 AM
Hi All,
I am trying to partition a Delta file by a column in PySpark, using this command:
df.write.format("delta").mode("overwrite").option("overwriteSchema", "true").partitionBy("Partition Column").save("Partition file path") -- this does not seem to work for me.
df.write.option("header", True).partitionBy("Partition Column").mode("overwrite").parquet("Partition file path") -- this works, but later steps complain that the file type is not Delta.
Please suggest code to save the partitioned file in Delta format.
Thanks in advance.
- Labels:
  - Azure databricks
  - Delta
  - Delta Files
  - Pyspark
Accepted Solutions
04-07-2022 02:35 AM
.save("Partition file path") - the argument should be a folder path, not a file path.
Additionally, which runtime are you using? partitionBy on Delta writes has been supported since long before Spark 3.x. Check your version with the command:
sc.version
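Putting the advice together, here is a minimal sketch of the corrected Delta write. It assumes a Databricks (or Delta-enabled Spark) session where `spark` and a DataFrame `df` already exist; the column name and the folder path are placeholders, not values from the original post.

```python
# Hedged sketch: assumes a Spark session with Delta Lake support and an
# existing DataFrame `df`. "partition_column" and "/mnt/data/my_table" are
# placeholder names for illustration only.
(
    df.write
      .format("delta")                    # write as Delta, not plain Parquet
      .mode("overwrite")
      .option("overwriteSchema", "true")  # allow replacing the schema on overwrite
      .partitionBy("partition_column")    # must be an existing column in df
      .save("/mnt/data/my_table")         # a folder path, not a file path
)

# Subsequent reads then see a proper Delta table at that folder:
df2 = spark.read.format("delta").load("/mnt/data/my_table")
```

Because `format("delta")` is used on both the write and the read, the later steps that complained about a non-Delta file type should no longer fail.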
04-07-2022 03:52 AM
Thanks @Hubert Dudek, it worked.
04-07-2022 04:48 AM
Haven't helped much, but I would appreciate it if you selected my answer as the best one 🙂
04-26-2022 09:36 AM
Hey @Harsha kriplani
Hope you are well. Thank you for posting here. It is awesome that you found a solution. Would you like to mark Hubert's answer as best? It would be really helpful for the other members too.
Cheers!

