Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Write data issue

Srajole
New Contributor

My Databricks job completes successfully, but the data is not written into the target table. The source path is correct and everything else checks out, yet I am not sure why the data is not being written to the Delta table.

1 REPLY

Vidhi_Khaitan
Databricks Employee

Hi @Srajole,

There are several possibilities as to why the data is not being written to the table:

You’re writing to a path different from the table’s storage location, or using a write mode that doesn’t replace data as expected. Check the table’s actual location first:
spark.sql("DESCRIBE DETAIL my_table").select("location").show(truncate=False)
Ensure the .write.format("delta").save(path) or .saveAsTable("my_table") call matches that location, and if you use append, check that partitions and filters match what your downstream queries expect. A fuller sketch follows below.
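
A minimal sketch of this check, assuming a Delta table named my_table and a DataFrame df standing in for the job's output (both hypothetical names); in a Databricks notebook, spark is already defined:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # no-op inside a Databricks notebook

# Inspect where the table actually stores its data.
location = (
    spark.sql("DESCRIBE DETAIL my_table")
    .select("location")
    .collect()[0]["location"]
)
print(f"Table location: {location}")

# Pick ONE of the following; running both would append the batch twice.
# Option A: write by table name, so the location is resolved for you.
df.write.format("delta").mode("append").saveAsTable("my_table")

# Option B: write by path; this must match the location printed above,
# otherwise the data lands somewhere the table never reads from.
# df.write.format("delta").mode("append").save(location)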

Your DataFrame has zero rows at the write stage (e.g., filters remove all rows, or join keys don’t match). Could you do a simple count on the DataFrame before actually writing to the table?
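
For example, a guard like this (df and my_table are hypothetical stand-ins for your job's DataFrame and target table) makes an empty batch fail loudly instead of silently succeeding:

row_count = df.count()
print(f"Rows about to be written: {row_count}")

if row_count == 0:
    # Nothing to write: walk the pipeline backwards (filters, join keys)
    # to find where the rows were dropped.
    raise ValueError("DataFrame is empty at the write stage")

df.write.format("delta").mode("append").saveAsTable("my_table")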

If the target Delta table is partitioned and you’re writing with dynamic partition overwrite or partition filters, it may be that no partitions match, in which case nothing is replaced even though the job succeeds. Make sure the partitions in the incoming batch match the table.
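
A sketch of that check, assuming the table is partitioned by a hypothetical event_date column and you're on a Databricks Runtime that supports dynamic partition overwrite for Delta (11.1+):

# Compare the partition values in the incoming batch with the table's partitions.
df.select("event_date").distinct().show()
spark.sql("SHOW PARTITIONS my_table").show()

# Dynamic partition overwrite replaces only the partitions present in df;
# if df has no rows for any partition, nothing is replaced and the job still succeeds.
(
    df.write.format("delta")
    .mode("overwrite")
    .option("partitionOverwriteMode", "dynamic")
    .saveAsTable("my_table")
)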

Overwriting a partitioned table without specifying overwriteSchema may drop existing data but fail to write the new batch if the partition columns don’t match the table’s layout.
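
If a changed partition layout is the cause, a sketch like this (same hypothetical names as above) explicitly allows the overwrite to replace the table's schema and partitioning, rather than failing partway:

(
    df.write.format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")  # allow schema and partition layout to be replaced
    .partitionBy("event_date")          # must be the partition columns you intend the table to have
    .saveAsTable("my_table")
)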

Hope this helps!