Most Python examples show the structure of the function passed to foreachBatch as:
def foreachBatchFunc(batchDF, batchId):
    batchDF.createOrReplaceTempView('viewName')
    (
        batchDF
        ._jdf.sparkSession()
        .sql(
            """
            << merge statement >>
            """
        )
    )
The problem is that ._jdf.sparkSession().sql() returns a Java object, not a PySpark DataFrame.
How do you get access to the result DataFrame containing the (affected, inserted, updated, deleted) row counts?
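
One approach that seems to work, sketched below and not verified on every runtime, is to wrap the returned Java object back into a PySpark DataFrame, or on Spark 3.3+ to use batchDF.sparkSession so that sql() already returns a Python DataFrame. The metrics column names mentioned in the comments (num_affected_rows, num_updated_rows, num_deleted_rows, num_inserted_rows) are what a Delta MERGE reports on Databricks and are an assumption here:

from pyspark.sql import DataFrame

def foreachBatchFunc(batchDF, batchId):
    batchDF.createOrReplaceTempView('viewName')

    merge_sql = """
    << merge statement >>
    """

    # Option 1: run the SQL through the JVM session as before, then wrap the
    # returned Java DataFrame so it can be used from Python.
    jdf = batchDF._jdf.sparkSession().sql(merge_sql)
    result_df = DataFrame(jdf, batchDF.sparkSession)  # older Spark: DataFrame(jdf, batchDF.sql_ctx)

    # Option 2 (Spark 3.3+): batchDF.sparkSession is the Python SparkSession,
    # so .sql() returns a PySpark DataFrame directly.
    # result_df = batchDF.sparkSession.sql(merge_sql)

    # The MERGE metrics row can now be inspected or collected, e.g. the
    # num_affected_rows / num_updated_rows / num_deleted_rows / num_inserted_rows columns.
    result_df.show()

The wrapping in Option 1 relies on the pyspark.sql.DataFrame constructor accepting a py4j Java DataFrame plus a session (or SQLContext on older versions), so whether it is appropriate may depend on the Spark version in use.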