overwriteSchema + writeStream

PiotrU
Contributor II

Hello, I have an issue with overwriting the schema while using writeStream - I do not receive any error, yet the schema remains unchanged.

Below is an example:

from pyspark.sql.functions import col

df_abc = (spark.readStream

   .format("cloudFiles")

   .option("cloudFiles.format", "parquet")

   .option("cloudFiles.schemaLocation", chklocat)

   .load(deltatbl))

df_abc = df_abc.withColumn("columna", col("columna").cast("timestamp"))

write = (df_abc.writeStream

   .outputMode("append")

   .option("checkpointLocation", chklocat)

   .trigger(availableNow=True)

   .option("overwriteSchema", "true")

   .toTable(dbname + "." + tblname))

Hey Kaniz, not sure I follow:

- the overwriteSchema option was set exactly as you wrote

- the session configuration is set up correctly

I have also tried several other configurations, including setting "mergeSchema" to "true", but it still doesn't work.
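For context, a likely explanation (hedged, based on Delta Lake's documented behavior, not on anything confirmed in this thread): overwriteSchema only takes effect together with overwrite semantics, so on an append-mode stream it is silently ignored, and mergeSchema only *adds* new columns - it does not change the type of an existing column such as the cast to timestamp above. A sketch of the two usual workarounds, reusing the variable names from the thread (chklocat, dbname, tblname):

```python
# Assumes an active `spark` session and the thread's variables (dbname, tblname).
from pyspark.sql.functions import col

# Option 1 (additive changes only): let Delta auto-merge new columns for the
# whole session. This does NOT retype an existing column.
spark.conf.set("spark.databricks.delta.schema.autoMerge.enabled", "true")

# Option 2 (type changes): stop the stream, do a one-off BATCH overwrite with
# overwriteSchema - which does work in mode("overwrite") - then restart the
# stream, typically with a fresh checkpoint/schema location.
(spark.read.table(dbname + "." + tblname)
    .withColumn("columna", col("columna").cast("timestamp"))
    .write
    .format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable(dbname + "." + tblname))
```

This is a sketch under those assumptions, not a verified fix for this exact environment; the restarted stream may also need its cloudFiles.schemaLocation cleared so Auto Loader picks up the new schema.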

PiotrU
Contributor II

That did not solve the problem.