WATERMARK ERROR WHILE JOINING WITH MULTIPLE STREAMING TABLES
08-21-2024 03:00 AM
I am creating an ETL pipeline where I read multiple streaming tables into temp tables and, at the end, join those tables to produce the output feed into another live table. For that I use a method that takes a list of tables as a parameter. Inside the method, a for loop streams them one by one into temp tables. After that, I execute the SQL to get the DataFrame for my new live table.
My Python method:
Below are my inputs to that method:
sources = ["table1", "table2", "table3", "table4", ..., "tablen"]
main_table = "parent_table"
schema = "id long, val string"
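For context, here is a minimal sketch of the pattern described above (the original method code is not shown, so all names here — the `_tmp` view suffix, the join key `id`, the aliases — are illustrative assumptions):

```python
def build_feed_sql(main_table, sources, join_key="id"):
    # Build the LEFT JOIN query over the registered temp views.
    # Aliases s0..sN and the join key are placeholders.
    sql = f"SELECT m.* FROM {main_table}_tmp m"
    for i, t in enumerate(sources):
        sql += f"\n  LEFT JOIN {t}_tmp s{i} ON m.{join_key} = s{i}.{join_key}"
    return sql

def register_sources(spark, sources):
    # Stream each source table into a temp view. Note: no watermark is
    # set here, which is what triggers the error shown below.
    for t in sources:
        spark.readStream.table(t).createOrReplaceTempView(f"{t}_tmp")
```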
I am getting the below error while running it in the pipeline:
ERROR:
Failed to start stream XXXXX in either append mode or complete mode. Append mode error: Stream-stream LeftOuter join between two streaming DataFrame/Datasets is not supported without a watermark in the join keys, or a watermark on the nullable side and an appropriate range condition;
Please help!
08-21-2024 06:10 AM
A stream-stream left join needs a watermark, e.g.:
stream_df = stream_df.withWatermark("timestamp_column", "30 minutes")
joined_df = stream_df.join(other_stream_df, "join_key")
08-21-2024 06:17 AM
Thank you for the reply. But in my case I am using a SQL query to read data from those temp tables, so how can I handle the watermark issue in that scenario?
08-21-2024 06:25 AM
try spark.read.table(main_table).withWatermark("timestamp_column", "30 minutes").createOrReplaceTempView("temp_view_name")
In SQL it is also possible using the WATERMARK clause.
https://learn.microsoft.com/en-us/azure/databricks/delta-live-tables/stateful-processing
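In Databricks SQL the watermark can be declared directly in the query with a WATERMARK clause on a `STREAM()` relation (syntax per the linked docs; the column name and delay below are placeholders). A sketch that builds such a query string:

```python
def watermarked_stream_sql(table, ts_col="event_ts", delay="30 MINUTES"):
    # WATERMARK clause on a STREAM() relation, as documented for
    # Databricks stateful processing; ts_col and delay are assumptions.
    return (f"SELECT * FROM STREAM({table}) "
            f"WATERMARK {ts_col} DELAY OF INTERVAL {delay}")
```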
08-21-2024 08:52 AM - edited 08-21-2024 08:53 AM
Do I need to watermark it while writing to the temp table, and in the SQL statement too?
08-22-2024 12:13 AM
It is necessary for the join, so if the DataFrame has a watermark, that's enough.
No need to define it multiple times.
08-22-2024 01:19 PM
I did put a watermark on the DataFrame, but I am still getting the same error while executing the SQL.
08-22-2024 11:16 PM
Sorry, I put the watermark on the non-streaming table. That is wrong, of course; the watermark has to be set on the streaming table (the source in this case).

