Hi @oliverw, thanks for your question!
Could you please confirm whether the following log line appears in the driver logs within the time range when the "0" metrics are reported:
Could not report metrics as number leaves in trigger logical plan did not match that of the execution plan
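If it's easier than searching the UI, here's a minimal sketch for scanning an exported copy of the driver log for that message. LOG_PATH is a hypothetical placeholder; point it at wherever you downloaded the driver logs from the cluster's "Driver logs" page:

# Minimal sketch: scan an exported driver log for the metrics warning.
# LOG_PATH is a hypothetical placeholder -- substitute your downloaded log file.
LOG_PATH = "/tmp/driver.log"
NEEDLE = ("Could not report metrics as number leaves in trigger "
          "logical plan did not match that of the execution plan")

with open(LOG_PATH) as f:
    for n, line in enumerate(f, start=1):
        if NEEDLE in line:
            print(f"line {n}: {line.rstrip()}")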
If that is the case, you can disable the optimization via these configs:
spark.conf.set("spark.databricks.optimizer.reuseExchangeAndSubquery", "false")
spark.conf.set("spark.sql.exchange.reuse", "false")
However, disabling these may cause a performance regression, so the suggestion is to first try the change in a lower environment if possible; if you have to apply it in place, monitor the workload and revert the change if required.
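If you do apply it in place, a StreamingQueryListener can help you confirm the metrics recover and watch batch durations for regressions. Here is a minimal sketch in PySpark, assuming PySpark 3.4+ (a recent DBR) and a notebook where `spark` is already defined:

from pyspark.sql.streaming import StreamingQueryListener

class MetricsMonitor(StreamingQueryListener):
    """Prints per-batch progress so you can verify metrics are no longer stuck at 0."""

    def onQueryStarted(self, event):
        print(f"Query started: {event.name} ({event.id})")

    def onQueryProgress(self, event):
        p = event.progress
        # numInputRows should stop reporting 0 once the plan mismatch goes away;
        # keep an eye on durationMs for any regression after disabling reuse.
        print(f"batch={p.batchId} numInputRows={p.numInputRows} "
              f"durationMs={p.durationMs}")

    def onQueryIdle(self, event):
        pass

    def onQueryTerminated(self, event):
        print(f"Query terminated: {event.id}")

spark.streams.addListener(MetricsMonitor())

Register the listener before starting the streaming query; it will then print progress for every micro-batch, and you can compare the numbers before and after reverting the configs.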