VZLA
Databricks Employee

Hi @oliverw, thanks for your question!

Could you please confirm whether, in the Driver logs covering the time range when the "0" metrics are reported, you can find the following log line:

Could not report metrics as number leaves in trigger logical plan did not match that of the execution plan

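If you have the driver log4j output available as a file (for example via cluster log delivery or a download from the cluster's Driver Logs page), a quick way to check is a small search script. This is only a sketch; the log path below is hypothetical and needs to be adjusted to your environment:

# Sketch: scan a downloaded driver log for the metrics warning.
# NOTE: the path below is hypothetical; point it at your actual driver log file.
log_path = "/dbfs/cluster-logs/<cluster-id>/driver/log4j-active.log"

needle = ("Could not report metrics as number leaves in trigger logical plan "
          "did not match that of the execution plan")

with open(log_path, "r", errors="ignore") as f:
    hits = [line.rstrip() for line in f if needle in line]

print(f"Found {len(hits)} matching line(s)")
for line in hits[:5]:
    print(line)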
If that is indeed your case, you can disable the optimization via the following configs:

spark.conf.set("spark.databricks.optimizer.reuseExchangeAndSubquery", "false")
spark.conf.set("spark.sql.exchange.reuse", "false")

However, disabling these configs may cause a performance regression. The suggestion is therefore to first try the change in a lower environment if possible; if you have to apply it in place, monitor the impact and revert it if required.
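For reference, here is a minimal sketch (assuming a notebook attached to the affected cluster, where spark is predefined) that captures the current values before changing them, so the change is easy to revert once you have monitored the query:

# Sketch: disable the optimizations for this session only, keeping the
# original values so they can be restored if needed.
configs = [
    "spark.databricks.optimizer.reuseExchangeAndSubquery",
    "spark.sql.exchange.reuse",
]

# Capture the current settings ("true" is assumed as the fallback default).
original = {c: spark.conf.get(c, "true") for c in configs}
print("Before:", original)

# Disable the exchange/subquery reuse optimizations.
for c in configs:
    spark.conf.set(c, "false")

# ... run and monitor the affected streaming query here ...

# Revert if the performance regression is not acceptable.
for c in configs:
    spark.conf.set(c, original[c])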