Hello @Dhruv-22,
No, mergeSchema doesn't auto-widen an incoming INT column to a table's BIGINT (nor does it auto-cast). mergeSchema mainly helps add new columns (and historically only a tiny set of numeric upcasts), but it won't change an existing column's data type.
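The usual workaround is to cast the incoming column yourself before the write. A minimal sketch, assuming the incoming DataFrame is incoming_df, the mismatched column is order_id, and the target table is main.sales.orders (all placeholders):

from pyspark.sql import functions as F

# Cast the incoming INT column up to BIGINT so the append matches the
# target table's schema instead of relying on mergeSchema.
fixed_df = incoming_df.withColumn("order_id", F.col("order_id").cast("bigint"))

(
    fixed_df.write
    .format("delta")
    .mode("append")
    .saveAsTable("main.sales.orders")  # placeholder table name
)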
Hello @anusha98,
You’re hitting a real limitation of Structured Streaming: non-time window functions (like row_number() over (...)) aren’t allowed on streaming DFs.
Instead, you can use agg() with max() to get the latest value per key, for example:
@dlt.table(name="temp_...
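Here is a fuller sketch of that pattern (the table name events_raw and the columns user_id, event_ts, value are placeholders for your schema). It reads the source as a live, non-streaming dependency, which sidesteps the streaming restriction, and uses max over a struct so the row with the greatest event_ts wins per key:

import dlt
from pyspark.sql import functions as F

@dlt.table(name="latest_value_per_key")
def latest_value_per_key():
    # max(struct(event_ts, value)) keeps the value that belongs to the latest
    # event_ts, replacing the row_number() window that streaming doesn't allow.
    return (
        dlt.read("events_raw")  # placeholder source table, read as a batch/live dependency
        .groupBy("user_id")
        .agg(F.max(F.struct("event_ts", "value")).alias("latest"))
        .select(
            "user_id",
            F.col("latest.event_ts").alias("event_ts"),
            F.col("latest.value").alias("value"),
        )
    )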
Hello @AmarKap,
When Spark decodes CP1252 bytes as UTF-8 or ISO-8859-1, you'll see the replacement character (�) in the output.
Can you try reading the file as:
df = (spark.readStream.format("cloudFiles").option("cloudFiles.format", "text").option("encoding", "windows-1252")...
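You can also confirm the diagnosis on a single sample file before wiring it into the stream. A quick batch check (the path is a placeholder): pull the raw bytes and decode them both ways to compare:

# Read one file's raw bytes, then compare cp1252 vs. utf-8 decoding.
raw = (
    spark.read.format("binaryFile")
    .load("/Volumes/raw/sample_file.txt")  # placeholder path
    .select("content")
    .first()["content"]
)

print(bytes(raw).decode("cp1252")[:200])                   # accented characters should render correctly
print(bytes(raw).decode("utf-8", errors="replace")[:200])  # shows the replacement characters (�)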
Hello @raghvendrarm1,
Below are the answers to your questions:
Do executors always send “results” to the driver?
No. Only actions that return values (e.g., collect, take, first, count) bring data back to the driver; collect explicitly "returns all the elements of the dataset" to the driver program.
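A small illustration of the difference (the output path is a placeholder):

df = spark.range(1_000_000)

n = df.count()    # action: executors compute partial counts; only one number reaches the driver
few = df.take(5)  # action: just 5 rows are shipped back to the driver
df.write.mode("overwrite").format("delta").save("/tmp/demo_out")  # executors write the data out directly; nothing returns to the driver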
Hello @DatabricksEngi1,
What DBR version and Databricks Connect version are you using?
CONNECT_URL_NOT_SET occurs when a Spark Connect session is created without specifying the connect URL. I think you have run into Databricks Connect's config-resolution rules, so try making the remote URL explicit, as in the sketch below.
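A minimal sketch of making the Connect URL explicit with Databricks Connect's DatabricksSession builder (the host, token, and cluster ID values are placeholders; pointing at a config profile in ~/.databrickscfg works as well):

from databricks.connect import DatabricksSession

spark = (
    DatabricksSession.builder
    .remote(
        host="https://<your-workspace>.cloud.databricks.com",  # placeholder
        token="<personal-access-token>",                       # placeholder
        cluster_id="<cluster-id>",                             # placeholder
    )
    .getOrCreate()
)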