Write Empty Delta file in Datalake

BhagS
New Contributor II

Hi all,

I am trying to write an empty Delta file to my data lake. To do this, I am doing the following:

  • Reading a Parquet file from my landing zone (this file contains only the schema of the SQL tables)
df = spark.read.format("parquet").load(landingZonePath)
  • Converting this file to Delta format
df.write.format("delta").save(centralizedZonePath)
  • But after checking the data lake, I see no file

Note: the Parquet file in the landing zone contains only the schema, with no data rows (the complete read/write sequence is sketched just below).
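
A minimal sketch of that sequence, assuming it runs in a Databricks notebook (display and dbutils.fs.ls are notebook built-ins; the listing step at the end is only added here to check the target path):

# Read the schema-only Parquet file from the landing zone
df = spark.read.format("parquet").load(landingZonePath)
df.printSchema()      # column names and types are present
print(df.count())     # 0, the source has no data rows

# Write it out in Delta format; even an empty DataFrame creates a _delta_log
df.write.format("delta").save(centralizedZonePath)

# List the target path: expect a _delta_log directory but no .parquet data files
display(dbutils.fs.ls(centralizedZonePath))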

1 ACCEPTED SOLUTION


Noopur_Nigam
Valued Contributor II

Hi @bhagya s, since your source file is empty, no data file is written inside the centralizedZonePath directory, i.e. no .parquet file is created in the target location. However, the _delta_log directory is created: it is the transaction log that holds the metadata of the Delta table, including its schema.

You can learn more about the transaction log here: https://databricks.com/discover/diving-into-delta-lake-talks/unpacking-transaction-log
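
One quick way to confirm this behavior (a sketch, reusing the same centralizedZonePath from the post) is to read the path back as a Delta table: the schema is served from _delta_log even though the table holds no rows.

# Read the empty Delta table back; the schema comes from the _delta_log metadata
empty_df = spark.read.format("delta").load(centralizedZonePath)
empty_df.printSchema()    # full column list and types
print(empty_df.count())   # 0 rows, as expected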


4 REPLIES

Hubert-Dudek
Esteemed Contributor III

@bhagya s, the file schema is stored in _delta_log.
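
For illustration, here is one way to peek at that schema directly in the transaction log (a sketch: it assumes the table was just created, so the first and only commit file is 00000000000000000000.json under centralizedZonePath):

import json

# Path to the first Delta commit file (20-digit, zero-padded version number)
log_file = centralizedZonePath + "/_delta_log/00000000000000000000.json"

# Each line of a commit file is one JSON action; the metaData action carries
# the table schema as a JSON string in its schemaString field
for row in spark.read.text(log_file).collect():
    action = json.loads(row.value)
    if "metaData" in action:
        print(json.dumps(json.loads(action["metaData"]["schemaString"]), indent=2))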

Kaniz
Community Manager

Hi @bhagya s, just a friendly follow-up: do you still need help, or did @Hubert Dudek's response help you find the solution? Please let us know.


Kaniz
Community Manager

Hi @bhagya s, we haven't heard back from you since the last response from @Noopur Nigam, and I was checking in to see whether you have a resolution yet. If you do, please share it with the community, as it can help others. Otherwise, we will follow up with more details and try to help.
