Data Engineering

Spark Dataframe write to Delta format doesn't create a _delta_log

Ovi
New Contributor III

Hello everyone,

I have an intermittent issue when creating a Delta table for the first time in Databricks: all the data gets written as Parquet at the specified location, but the _delta_log directory is not created or, if created, is left empty, so the resulting data folder isn't recognized as a Delta table.

The command is a basic one:

sourceDf.write
  .format("delta")
  .mode(SaveMode.ErrorIfExists)
  .partitionBy("date")
  .option("mergeSchema", "true")
  .save(deltaLocation)

Thank you,

Ovi 

1 REPLY 1

jose_gonzalez
Moderator

Can you list (display) the folder at "deltaLocation"? What files do you see there? Have you tried using a new location for testing? Do you get the same behavior?
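As a starting point for that check, here is a minimal sketch using plain JVM file APIs that lists a folder and reports whether it contains a non-empty _delta_log. It assumes a locally accessible path; on Databricks itself you would instead run dbutils.fs.ls(deltaLocation) against the actual storage location. The helper names are hypothetical, not part of any Delta Lake API.

```scala
import java.nio.file.{Files, Path}
import scala.jdk.CollectionConverters._

// Hypothetical helper for diagnosing the symptom above: a Delta write
// should leave a _delta_log directory containing at least one commit
// file (e.g. 00000000000000000000.json) next to the Parquet files.
object DeltaLogCheck {
  // True only if dir contains a _delta_log directory with at least one entry.
  def hasNonEmptyDeltaLog(dir: Path): Boolean = {
    val log = dir.resolve("_delta_log")
    Files.isDirectory(log) && Files.list(log).iterator().asScala.nonEmpty
  }

  // Sorted names of the folder's top-level entries, for a quick visual check.
  def listEntries(dir: Path): Seq[String] =
    Files.list(dir).iterator().asScala.map(_.getFileName.toString).toSeq.sorted
}
```

A folder holding only bare .parquet files (or an empty _delta_log) makes hasNonEmptyDeltaLog return false, which matches the behavior described in the question.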
