That's fair.

Anyway, this feature is essentially a backport from Spark 3.3.0, but since Spark 3.3.0 has not been released yet I cannot use it directly: my code won't compile, so my whole development process is blocked.

In the meantime I've found an ugly hack (using reflection) that lets me work around the issue:

import org.apache.spark.sql.streaming.Trigger

// Look up Trigger.AvailableNow via reflection, so the code still compiles
// against Spark versions where the method does not exist yet.
val clazz   = Class.forName("org.apache.spark.sql.streaming.Trigger")
val method  = clazz.getMethod("AvailableNow")
val trigger = method.invoke(null).asInstanceOf[Trigger]

val streamWriter = df.writeStream
  .format("delta")
  .options(config.sparkWriteOptions)
  .trigger(trigger)
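The same reflection trick can be wrapped in a small reusable helper so the lookup fails soft instead of throwing when the class or method is missing. This is only a sketch, and `staticOrElse` is a hypothetical name, not a Spark API:

```scala
import scala.util.Try

object ReflectiveStatic {
  // Hypothetical helper (not part of Spark): invoke a no-argument static
  // method by name via reflection, returning a fallback value when the
  // class or method is absent from the current classpath.
  def staticOrElse[T](className: String, methodName: String, fallback: => T): T =
    Try {
      val clazz  = Class.forName(className)
      val method = clazz.getMethod(methodName)
      method.invoke(null).asInstanceOf[T]
    }.getOrElse(fallback)
}
```

With this in place the snippet above could call `ReflectiveStatic.staticOrElse[Trigger]("org.apache.spark.sql.streaming.Trigger", "AvailableNow", Trigger.Once())`, falling back to e.g. `Trigger.Once()` on older Spark versions (with different semantics, of course).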

Anyway, I think this is something that needs to be addressed somehow: in the future there may be other backported features where this workaround won't apply.
