Data Engineering

DataFrame loses its contents after the write operation to the database.

Krish-685291
New Contributor III

We had working code, as shown below.

print(f"{file_name}Before insert count", datetime.datetime.now(), scan_df_new.count())

print(scan_df_new.show())

scan_20220908120005_10Before insert count 2022-09-14 11:37:15.853588 3

+-------------------+----------+-------------------+--------------------+----------+

|            tran_id|t_store_id|      scan_datetime|         customer_id|updated_by|

+-------------------+----------+-------------------+--------------------+----------+

|1230000000000000004|      4395|2022-09-08 03:00:01|20220816a51cee4264f1|Databricks|

|1230000000000000005|      4394|2022-09-08 02:58:00|20220816a51cee4264f1|Databricks|

|1230000000000000006|      4393|2022-09-08 03:00:04|20220816a51cee4264f1|Databricks|

+-------------------+----------+-------------------+--------------------+----------+

The DataFrame is used for further business logic processing after the write operation. This was working earlier, but recently we are observing strange behavior where the data in the DataFrame is getting lost: when we check its contents, or even its count, it shows up empty.

scan_df_new.write.format("jdbc").option("url", jdbcUrl).option("dbtable", scan_table).mode("append").save()
print(f"{file_name}After insert count", datetime.datetime.now(), scan_df_new.count())
print(scan_df_new.show())

Output:

scan_20220908120005_10After insert count 2022-09-14 11:37:18.372147 0
+-------+----------+-------------+-----------+----------+
|tran_id|t_store_id|scan_datetime|customer_id|updated_by|
+-------+----------+-------------+-----------+----------+
None
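
To check whether the insert itself succeeded independently of the in-memory DataFrame, we can read the target table back over the same JDBC connection and count the rows. A minimal sketch, using the same jdbcUrl and scan_table variables as above (spark here is the default session in the notebook):

# Read the target table back to verify whether the rows actually landed,
# independent of what the in-memory DataFrame reports.
written_df = (
    spark.read.format("jdbc")
    .option("url", jdbcUrl)
    .option("dbtable", scan_table)
    .load()
)
print(f"{scan_table} row count after insert:", written_df.count())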

Has anything changed recently in Databricks that could be impacting this?
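
The only workaround we can think of so far is to persist the DataFrame before the write, on the guess that it is being lazily re-evaluated against the source after the write. A rough sketch, again with the same variables as above (not something we have confirmed fixes it):

from pyspark import StorageLevel

# Materialize the DataFrame once so later count()/show() calls do not
# trigger a re-read of the source.
scan_df_new = scan_df_new.persist(StorageLevel.MEMORY_AND_DISK)
before_count = scan_df_new.count()  # action forces the persist

scan_df_new.write.format("jdbc").option("url", jdbcUrl).option("dbtable", scan_table).mode("append").save()

after_count = scan_df_new.count()  # expected to match before_count
print(f"{file_name} before={before_count} after={after_count}")

scan_df_new.unpersist()  # release the cache when the DataFrame is no longer needed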

Any help on this is appreciated.

Thanks

Krishna

