Ashwin_DSA
Databricks Employee

Hi @IM_01 ,

Sorry! I may have clicked the wrong button and incorrectly accepted your comment as a solution! I don't know how to revert it. I'll check that separately. 🙂 

When I said "ignore it in logic," I meant: don't physically drop the column from the source Delta table. Instead, leave the column in the source schema and simply stop using it downstream (don't select or join on it, or treat it as always NULL). That way, the streaming read still sees a stable schema and doesn't error.
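As a minimal sketch of what "stop using it downstream" can look like (the table columns below are placeholders, not your actual schema), you can keep an explicit allow-list and select only those columns, so the deprecated column never reaches later transformations:

```python
# Hedged sketch: column names are hypothetical placeholders.
ALL_COLUMNS = ["id", "amount", "delcol", "updated_at"]  # current source schema
DEPRECATED = {"delcol"}  # "ignored in logic" but NOT dropped from the table

# Columns the downstream logic is allowed to touch.
keep = [c for c in ALL_COLUMNS if c not in DEPRECATED]

# In PySpark this would be applied as: df.select(*keep)
print(keep)
```

The source schema stays stable for the streaming read; only your projection changes.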

In your case, the column was already dropped from the source, so the error is raised before your transformation runs. Adding it back with

from pyspark.sql.functions import lit

df = df.withColumn("delcol", lit(None))

only changes the in‑memory DataFrame after the read. It does not fix the fact that the source table's schema changed incompatibly, so the stream still fails.

With SDP today, you essentially have two supported options:

  1. Restore the column on the source table (same name/type), then run a full refresh of the affected SDP tables so checkpoints are rebuilt; or
  2. Leave the column dropped and run a full refresh of those SDP streaming tables so they’re rebuilt against the new schema.
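For option 1, restoring the column is a one-line DDL on the source table. A hedged sketch, assuming the source table is `bronze.orders` and the dropped column was `delcol STRING` (both are placeholders for your actual names):

```python
# Hedged sketch: the table and column names are placeholders.
table = "bronze.orders"
restore_sql = f"ALTER TABLE {table} ADD COLUMNS (delcol STRING)"

# On Databricks you would run this with: spark.sql(restore_sql)
# and then trigger a full refresh of the affected SDP tables (e.g. from
# the pipeline UI) so the streaming checkpoints are rebuilt.
print(restore_sql)
```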

Trying to work around this with column mapping / schemaTrackingLocation isn’t supported for SDP‑managed tables, which is why SET_TBLPROPERTIES fails.

Hope that clarifies things.

If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.

Regards,
Ashwin | Delivery Solution Architect @ Databricks
Helping you build and scale the Data Intelligence Platform.
***Opinions are my own***