Hi all,
I am currently trying to write an empty Delta table (schema only, no rows) to my data lake. To do this I am doing the following:
- Reading the Parquet file from my landing zone (this file contains only the schema of the SQL tables, no data rows):
df = spark.read.format("parquet").load(landingZonePath)
- After this, I write the DataFrame out in Delta format:
df.write.format("delta").save(centralizedZonePath)
- But when I check the data lake afterwards, I see no file at the target path.
Note: the Parquet file in the landing zone does have the schema. A sketch of the full flow I am running is below.
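For completeness, here is a minimal sketch of what I run end to end, assuming a notebook environment where spark is already available and Delta Lake is configured; landingZonePath and centralizedZonePath are placeholders for my actual lake paths, and the DeltaTable.isDeltaTable call is just how I am trying to verify the result:

from delta.tables import DeltaTable

# landingZonePath / centralizedZonePath are defined earlier in my notebook

# Read the schema-only Parquet file from the landing zone
df = spark.read.format("parquet").load(landingZonePath)

# Confirm the DataFrame is empty but carries the expected schema
df.printSchema()
print("row count:", df.count())  # expected: 0

# Write it out in Delta format (the default save mode errors if the path already exists)
df.write.format("delta").save(centralizedZonePath)

# Check whether a Delta table was actually created at the target path
print("is delta table:", DeltaTable.isDeltaTable(spark, centralizedZonePath))

# Reading it back should still return the schema even with zero rows
spark.read.format("delta").load(centralizedZonePath).printSchema()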