Spark DataFrame write to Delta format doesn't create a _delta_log
05-19-2023 08:51 AM
Hello everyone,
I have an intermittent issue when trying to create a Delta table for the first time in Databricks: all the data gets written as Parquet at the specified location, but the _delta_log directory is either not created or left empty, so the resulting data folder isn't recognized as a Delta table.
The command is a basic one:
sourceDf.write
  .format("delta")
  .mode(SaveMode.ErrorIfExists)
  .partitionBy("date")
  .option("mergeSchema", "true")
  .save(deltaLocation)
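For context, a save location is only treated as a Delta table when its _delta_log directory contains at least one commit JSON file. A minimal shell sketch of that check (all paths and file names here are hypothetical stand-ins, created only so the snippet is self-contained):

```shell
# Hypothetical stand-in for the real save location.
deltaLocation=/tmp/delta_demo
mkdir -p "$deltaLocation/_delta_log"
touch "$deltaLocation/_delta_log/00000000000000000000.json"   # example first commit file

# A folder is a Delta table only if _delta_log holds commit JSON files.
if ls "$deltaLocation/_delta_log/"*.json >/dev/null 2>&1; then
  echo "Delta table: _delta_log contains commit files"
else
  echo "NOT a Delta table: _delta_log missing or empty"
fi
```

If the check fails on a real location, the write produced plain Parquet rather than a Delta table.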
Thank you,
Ovi
- Labels: Delta Format
06-06-2023 11:48 AM
Can you list (display) the contents of the folder "deltaLocation"? What files do you see there? Have you tried using a new location for testing? Do you get the same behavior?
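In a Databricks notebook the listing can be done with `dbutils.fs.ls(deltaLocation)` or `%fs ls <path>`. For a local or driver-side path, a shell sketch like the following shows what to look for (the path and file name are hypothetical, created here only so the snippet runs standalone):

```shell
# Hypothetical stand-in for the real save location, with one example data file.
deltaLocation=/tmp/delta_demo_check
mkdir -p "$deltaLocation"
touch "$deltaLocation/part-00000.snappy.parquet"   # example Parquet part file name

# Expect Parquet part files plus a _delta_log directory; warn if the log is absent.
ls -la "$deltaLocation"
[ -d "$deltaLocation/_delta_log" ] || echo "warning: _delta_log is missing"
```

A listing that shows only part files and no _delta_log would match the behavior described in the question.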