I have created a Delta Live Tables pipeline that uses spark.sql() to execute a query and then uses df.write.mode("append").insertInto() to insert the result into the target table. At the end, the function returns a dummy table, since that was the requirement (a DLT table function must return a DataFrame).

I have also implemented count checks so that if the data is already present it should not be inserted again. Despite this, the data is still being inserted 5-7 times. I can confirm this in the table history: the pipeline triggers and executes the insert up to 7 times, even though the check is supposed to prevent duplicate inserts.

Can someone help me debug this? Also, how is initialization done in a Delta Live Tables pipeline?
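For context, a simplified sketch of the shape of my code is below. The table names, the query, and the count check are placeholders, not my actual code; the key point is that the insertInto() call is a side effect inside the table function, while the returned DataFrame is just a dummy:

```python
import dlt

@dlt.table(name="dummy_output")
def load_target():
    # Run the transformation query (placeholder query).
    df = spark.sql("SELECT * FROM source_view")

    # Count check: only insert if no rows are already present (placeholder logic).
    existing = spark.sql("SELECT COUNT(*) AS c FROM target_table").collect()[0]["c"]
    if existing == 0:
        # Side-effect write into a table outside DLT's management.
        # This runs every time the pipeline (re)invokes the function,
        # which is where the duplicate inserts appear.
        df.write.mode("append").insertInto("target_table")

    # DLT requires the function to return a DataFrame, so return a dummy one.
    return spark.createDataFrame([(1,)], ["dummy"])
```

(This sketch assumes the `spark` session that Databricks injects into DLT notebooks; it is not runnable outside a Databricks pipeline.)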