Hello everyone,
We currently have two streaming (Bronze) jobs, created as two tasks in the same job, running on the same compute, and both merge data into the same Silver table. With this setup, I sometimes get an error related to "insert concurrent" because Delta Lake blocks one of the conflicting writes.
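For context on why two separate writers can conflict: Delta Lake uses optimistic concurrency control, so two transactions that read the table at the same version and then both try to commit will collide, and the loser is rejected. Below is a toy sketch of that compare-and-swap idea; it is not Delta's actual implementation, and `ToyDeltaLog` and its method names are illustrative only.

```python
import threading

class ToyDeltaLog:
    """Toy model of optimistic concurrency control (illustrative, not Delta's code):
    a commit succeeds only if the table version has not advanced since the
    transaction read it."""
    def __init__(self):
        self.version = 0
        self._lock = threading.Lock()

    def try_commit(self, read_version):
        with self._lock:
            if self.version != read_version:
                return False  # another writer committed first -> conflict
            self.version += 1
            return True

log = ToyDeltaLog()

# Two "tasks" both read version 0, then both try to commit,
# like two streaming MERGEs writing to the same Silver table.
read = log.version
first = log.try_commit(read)   # wins the race and bumps the version
second = log.try_commit(read)  # loses: the version moved underneath it
print(first, second)  # True False
```

In the real system, the failed writer surfaces this as a concurrent-modification error instead of silently retrying.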
But when I declare both streams in the same task (the same file), the error does not occur:

brz1 = df1.readStream...start()
brz2 = df2.readStream...start()
I hope someone can explain why the "insert concurrent" error does not occur when both streams run in the same task.