Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Write Empty Delta File in Data Lake

BhagS
New Contributor II

hi all,

Currently, I am trying to write an empty Delta file to the data lake. To do this, I am doing the following:

  • Reading the parquet file from my landing zone (this file contains only the schema of the SQL tables):
df = spark.read.format('parquet').load(landingZonePath)
  • After this, I convert the file to Delta:
df.write.format("delta").save(centralizedZonePath)
  • But after checking the data lake, I see no file.

Note: The parquet file in the landing zone has the schema.
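
For reference, one way to check what (if anything) actually landed in the target path is to list it with the Databricks file-system utility. A minimal sketch, assuming the same centralizedZonePath variable as above:

# List the contents of the target path (Databricks file-system utility).
# centralizedZonePath is assumed to be the same variable used in the write above.
for f in dbutils.fs.ls(centralizedZonePath):
    print(f.path, f.size)
# For an empty source, only the _delta_log directory is expected here
# (see the accepted reply below).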


2 REPLIES

Hubert-Dudek
Esteemed Contributor III

@bhagya s, the file schema is stored in the _delta_log.
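
To illustrate where that schema lives, a minimal sketch (assuming the centralizedZonePath from the original post and that this is the table's first commit):

# Each commit file in _delta_log is JSON; the metaData action carries the
# table schema in its schemaString field. The file name below assumes this
# is the table's very first commit.
log_path = f"{centralizedZonePath}/_delta_log/00000000000000000000.json"
log_df = spark.read.json(log_path)
log_df.select("metaData.schemaString").where("metaData IS NOT NULL").show(truncate=False)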

Noopur_Nigam
Databricks Employee
Accepted Solution

Hi @bhagya s, since your source file is empty, there is no data file inside the centralizedZonePath directory, i.e. no .parquet file is created in the target location. However, the _delta_log directory is the transaction log; it holds the metadata of the Delta table, including its schema.

You can read more about the transaction log here: https://databricks.com/discover/diving-into-delta-lake-talks/unpacking-transaction-log
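
As a quick confirmation, a minimal sketch using the same centralizedZonePath: the empty Delta table can be read back, and its schema is returned from the transaction log even though it contains zero rows.

# Read the empty Delta table back: the schema comes from _delta_log,
# and the row count is zero because no data files were written.
empty_df = spark.read.format("delta").load(centralizedZonePath)
empty_df.printSchema()   # schema recorded in the transaction log
print(empty_df.count())  # 0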
