Hi @SatyaKoduri
This is a known issue with newer Spark versions (3.5+), which ship with Databricks Runtime 15.4.
Schema inference has become stricter and struggles with deeply nested structures like your YAML's nested maps.
Here are a few solutions:
Option 1: Flatten the structure before creating the DataFrame
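Here's a rough sketch of what that could look like. The `config` dict, the `flatten_dict` helper, and the key/value column names are just placeholders standing in for your own parsed YAML (e.g. the result of `yaml.safe_load`), so adjust them to your structure:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def flatten_dict(d, parent_key="", sep="."):
    """Recursively flatten nested dicts into dotted key/value pairs."""
    items = {}
    for key, value in d.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten_dict(value, new_key, sep=sep))
        else:
            # Cast leaves to string so every value has a uniform type
            items[new_key] = str(value)
    return items

# Stand-in for your parsed YAML (e.g. the output of yaml.safe_load)
config = {
    "app": {"name": "etl-job", "settings": {"retries": 3, "timeout": 30}},
    "env": "prod",
}

flat = flatten_dict(config)
df = spark.createDataFrame(list(flat.items()), schema="key STRING, value STRING")
df.show(truncate=False)
```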
Option 2: Convert nested structures to JSON strings
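With this approach you serialize each nested value to a JSON string before calling createDataFrame, so Spark only ever sees plain strings, and you parse fields back out with functions like get_json_object (or from_json with a schema) when you need them. Again, the sample `config` dict and the `$.settings.retries` path are illustrative only:

```python
import json
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Placeholder for your parsed YAML
config = {
    "app": {"name": "etl-job", "settings": {"retries": 3, "timeout": 30}},
    "env": "prod",
}

# Serialize each top-level value to a JSON string so inference stays trivial
rows = [(k, json.dumps(v)) for k, v in config.items()]
df = spark.createDataFrame(rows, schema="key STRING, value_json STRING")

# Pull nested fields back out when you actually need them
df.select(
    "key",
    F.get_json_object("value_json", "$.settings.retries").alias("retries"),
).show(truncate=False)
```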
Option 3: Use a more explicit schema (flexible but structured)
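The idea here is to declare the schema yourself but keep the nested part open-ended, e.g. with a MapType of strings, so you don't have to enumerate every field. The field names and sample rows below are assumptions, not your actual layout:

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import MapType, StringType, StructField, StructType

spark = SparkSession.builder.getOrCreate()

# Illustrative schema: a name column plus a free-form string-to-string map,
# which keeps some flexibility without relying on inference
schema = StructType([
    StructField("name", StringType(), True),
    StructField("properties", MapType(StringType(), StringType()), True),
])

# Placeholder rows shaped to match the schema above
rows = [
    ("app", {"retries": "3", "timeout": "30"}),
    ("env", {"value": "prod"}),
]

df = spark.createDataFrame(rows, schema=schema)
df.printSchema()
df.show(truncate=False)
```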
Option 4: Force schema inference with an RDD-based approach
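A minimal sketch of that, again with made-up records standing in for your YAML entries; building the DataFrame from an RDD of Rows lets Spark sample the data to infer the schema, and samplingRatio=1.0 makes it look at every record:

```python
from pyspark.sql import Row, SparkSession

spark = SparkSession.builder.getOrCreate()

# Placeholder records standing in for your parsed YAML entries
records = [
    {"name": "app", "settings": {"retries": 3, "timeout": 30}},
    {"name": "env", "settings": {"retries": 5, "timeout": 60}},
]

# Parallelize to an RDD of Rows and let Spark infer the schema by sampling
rdd = spark.sparkContext.parallelize([Row(**r) for r in records])
df = spark.createDataFrame(rdd, samplingRatio=1.0)

df.printSchema()
df.show(truncate=False)
```

One caveat: the RDD API isn't available on all cluster types (e.g. shared access mode or serverless compute), so this option only applies where sparkContext is accessible.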
The flattening approach (Option 1) is probably your best bet if you want to maintain the flexibility you had in 13.3 while working with the stricter schema inference in 15.4. It converts your nested structure into a flat key-value format that Spark can easily handle.
LR