I get the following error:

com.databricks.sql.transaction.tahoe.DeltaAnalysisException: [DELTA_INVALID_CHARACTERS_IN_COLUMN_NAMES] Found invalid character(s) among ' ,;{}()\n\t=' in the column names of your schema.
This is a new Databricks instance, and I've checked the CSV headers: they are all valid, with no special characters in the column names.
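For reference, this is roughly how I checked a header line (the file name "devices.csv" below is a placeholder for one of my actual files; the character set mirrors the one listed in the error):

# Peek at the first bytes of one landed CSV file and inspect its header line.
# "devices.csv" is a placeholder file name.
invalid_chars = set(" ,;{}()\n\t=")
sample = dbutils.fs.head("dbfs:/mnt/dblakehouse/RawLanding/ysoft/device/devices.csv", 4096)
header = sample.splitlines()[0]
for name in header.split(","):
    bad = invalid_chars & set(name)
    if bad:
        print(f"column {name!r} contains invalid character(s): {sorted(bad)}")

This flags nothing for my files.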
This is my code:
# Stream CSV files from the landing zone with Auto Loader and append them
# to a bronze Delta table, allowing the schema to evolve.
device_path = "dbfs:/mnt/dblakehouse/RawLanding/ysoft/device"

(spark.readStream
    .format("cloudFiles")  # Auto Loader source
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.inferColumnTypes", "true")
    .option("cloudFiles.schemaLocation", f"{device_path}/checkpointLocation")
    .load(f"{device_path}/")
    .writeStream
    .option("checkpointLocation", f"{device_path}/checkpointLocation")
    .option("mergeSchema", "true")  # merge new columns into the target table
    .outputMode("append")
    .toTable("printing_poc01.bronze.smartq_devices")
)
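In case it helps with diagnosis, this is the kind of check I can run to see which column names Auto Loader actually recorded (I'm assuming the usual layout where the inferred schema is written to a _schemas subdirectory of the schema location):

# Auto Loader keeps inferred schemas under <schemaLocation>/_schemas;
# list those log files and print one to inspect the recorded column names.
schema_dir = f"{device_path}/checkpointLocation/_schemas"
entries = dbutils.fs.ls(schema_dir)
for e in entries:
    print(e.path)
print(dbutils.fs.head(entries[-1].path, 4096))

Why would Delta still complain about invalid characters when the headers themselves look clean?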