Hi @_its_akshaye
Yes: capture it from the DLT event log, or derive it directly from the target table's CDF (Change Data Feed), then aggregate by time.
Options that work well
Use the DLT event log's "rows written" metrics. Every pipeline writes a structured event log to y...
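Once you have the per-update rows-written metrics out of the event log (in DLT these surface under `details:flow_progress.metrics.num_output_rows`), aggregating by time is straightforward. A minimal pure-Python sketch, assuming simplified event records; in a real pipeline you would query the event log with SQL and group there instead:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical, simplified flow_progress events, mimicking the
# timestamp and num_output_rows fields from the DLT event log.
events = [
    {"timestamp": "2024-05-01T10:05:00", "num_output_rows": 1200},
    {"timestamp": "2024-05-01T10:45:00", "num_output_rows": 800},
    {"timestamp": "2024-05-01T11:10:00", "num_output_rows": 500},
]

def rows_written_by_hour(events):
    """Sum rows-written metrics into hourly buckets."""
    buckets = defaultdict(int)
    for e in events:
        hour = datetime.fromisoformat(e["timestamp"]).strftime("%Y-%m-%d %H:00")
        buckets[hour] += e["num_output_rows"]
    return dict(buckets)

print(rows_written_by_hour(events))
```

The same grouping works at daily or minute granularity by changing the `strftime` format.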
Hi @sandy_123
The "Multiple failures in stage materialization" error at line 120 is caused by a massive shuffle bottleneck.
Check the Spark UI to identify the underlying failure reason, such as RPC timeouts or executor heartbeat errors.
Primary Issues:
Window...
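As a starting point for mitigating the shuffle bottleneck, these are the Spark settings most commonly adjusted for stage-materialization failures. A hedged sketch: the values are illustrative and should be tuned to your data volume, not copied verbatim:

```python
# Spark confs commonly used to mitigate shuffle failures during
# stage materialization. Values here are illustrative defaults.
shuffle_tuning = {
    # Adaptive Query Execution coalesces/splits shuffle partitions at runtime
    "spark.sql.adaptive.enabled": "true",
    # Handles skewed join partitions, a frequent cause of stage retries
    "spark.sql.adaptive.skewJoin.enabled": "true",
    # Raise the starting shuffle partition count for large inputs
    "spark.sql.shuffle.partitions": "400",
}

# On a cluster you would apply these via spark.conf.set(key, value);
# here we just render them as Spark conf lines.
for key, value in sorted(shuffle_tuning.items()):
    print(f"{key} {value}")
```

If the Spark UI shows one or two tasks in a stage taking far longer than the rest, the skew-join setting is usually the one that matters.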
Hi @s_agarwal
Please find below my findings for your query.
Serverless uses cached Unity Catalog metadata
Your UC metadata points to an old Delta version
Regular clusters bypass this cache
Fix by refreshing the table or forcing a UC metadata rewrite
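A minimal sketch of the refresh step, using a hypothetical three-level table name; run these via `spark.sql(...)` or in the SQL editor:

```python
# Hypothetical Unity Catalog table name -- substitute your own.
table = "main.analytics.orders"

refresh_statements = [
    f"REFRESH TABLE {table}",    # invalidate cached metadata for the table
    f"DESCRIBE DETAIL {table}",  # confirm which Delta version is now visible
]

for stmt in refresh_statements:
    print(stmt)
```

Comparing `DESCRIBE DETAIL` output between the serverless warehouse and a regular cluster is a quick way to confirm whether stale cached metadata was the cause.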
Hi @seefoods
Please find below my findings for your case.
You don’t need (and can’t meaningfully add) any Spark conf to enable availableNow on Databricks Serverless.
Let me explain clearly, and then show what is safe to do in your decorator.
availa...
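To make the "no Spark conf needed" point concrete: `availableNow` is not a cluster setting at all, it is a trigger set on the stream writer. A hedged sketch with hypothetical table names, requiring a Databricks/Spark session to actually run:

```python
# availableNow is configured on the writer, not via any Spark conf:
# it processes everything currently available, then stops the stream.
# Table names and checkpoint path are placeholders.
def run_available_now(spark, source_table, target_table, checkpoint):
    return (
        spark.readStream
        .table(source_table)
        .writeStream
        .option("checkpointLocation", checkpoint)
        .trigger(availableNow=True)  # batch-style run over available data
        .toTable(target_table)
    )
```

Inside a DLT decorator you would not even set the trigger yourself; the pipeline's execution mode controls it, which is why adding a conf for it has no effect.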
Hi @mrstevegross
You must match the container image version to the cluster’s DBR version.
Option 1 — Run the job on a DBR 13.3 cluster
Create a compute with DBR 13.3 LTS
Use your container image standard:13.3-LTS
This is the correct configuration.
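The matching pair looks like this in a cluster spec. A sketch assuming the Clusters API field names and the public `databricksruntime/standard` image; the node type is a placeholder you must fill in:

```python
# Relevant cluster fields for Databricks Container Services:
# the runtime line and the image tag must agree (13.3 LTS here).
cluster_spec = {
    "spark_version": "13.3.x-scala2.12",   # DBR 13.3 LTS runtime
    "node_type_id": "<your-node-type>",    # placeholder
    "num_workers": 2,
    "docker_image": {
        "url": "databricksruntime/standard:13.3-LTS",  # matching image tag
    },
}
```

If the `spark_version` line and the image tag drift apart (say, a 14.x runtime with a 13.3 image), you get the version-mismatch failure described above.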
Opti...