Trigger:
Performing a 'Full refresh all' on a DLT pipeline (new or existing) whose target DLT table already existed beforehand.
Issue:
Getting a 'DeltaColumnMappingUnsupportedException' error during the "Setting up tables" stage:
```
com.databricks.sql.transaction.tahoe.DeltaColumnMappingUnsupportedException:
Schema change is detected:
old schema:
root
new schema:
root
|-- Field 1: string (nullable = true)
|-- Field 2: string (nullable = true)
|-- Field 3: timestamp (nullable = true)
Schema changes are not allowed during the change of column mapping mode.
```
Setup:
The DLT pipeline is configured to run a Python notebook that contains something like this:
```
import dlt

@dlt.table(
    comment="Raw table",
    table_properties={
        "delta.minReaderVersion": "2",
        "delta.minWriterVersion": "5",
        "delta.columnMapping.mode": "name",
    },
)
def raw_table():
    return (
        spark.read.format(file_type)
        .option("inferSchema", False)
        .option("header", True)
        .load(some_file_location)
    )
```
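If it helps with diagnosis: one way to check which Delta properties are already set on the pre-existing target table is to query its table properties (table name below is a placeholder, not the actual pipeline target):

```
# Hypothetical diagnostic snippet, run outside the pipeline.
# Substitute the pipeline's actual target table for the placeholder name.
props = spark.sql("SHOW TBLPROPERTIES my_schema.raw_table")
props.show(truncate=False)
# Properties of interest here: delta.columnMapping.mode,
# delta.minReaderVersion, delta.minWriterVersion
```

The existing table does appear to have column mapping enabled already, which is why I'm unsure what schema change the error is referring to.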
I'd appreciate any insights into this issue. Thanks.