To access the s.d field from your Delta table in Power BI, the content of the s column has to be valid JSON that Power BI's Json.Document() function can parse. Your issue arises because the default string representation of struct<d:date> in Databricks comes through as {"d":2020-01-01}, which is not valid JSON: string values, including dates, must be quoted, i.e. {"d":"2020-01-01"}.
The s_json column works fine because it uses to_json(), which produces a properly quoted JSON string, unlike the default cast of struct to string.
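For a quick illustration, here is a minimal Spark SQL check of the quoting that to_json() applies; the named_struct literal is only a stand-in for your real s column:

-- Minimal check: to_json() quotes the date value, producing valid JSON.
-- named_struct(...) is a stand-in for the actual s column.
SELECT to_json(named_struct('d', DATE '2020-01-01')) AS s_json;
-- returns {"d":"2020-01-01"}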
Why This Happens
- Databricks emits a simplified, Spark-internal, JSON-like representation when you SELECT or export struct values directly; it is not standard JSON (dates are not quoted).
- Power BI's Json.Document() requires strict JSON compliance: string values must be quoted.
Solutions
1. Use to_json() in Your Databricks Views/Queries
Ensure that the data you supply to Power BI for complex columns like structs is always pre-serialized to a valid JSON string.
For example, in your Databricks SQL, use:
SELECT to_json(s) as s_json, d FROM date_struct
And then let Power BI use the s_json column, as it already does in your example. This is the cleanest and most reliable approach.
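If you don't want every report query to have to remember the to_json() call, you can bake it into a dedicated view; a sketch, assuming a hypothetical view name date_struct_for_pbi:

-- Sketch: expose a Power BI-friendly view so the struct always arrives as valid JSON.
-- The view name date_struct_for_pbi is hypothetical.
CREATE OR REPLACE VIEW date_struct_for_pbi AS
SELECT to_json(s) AS s_json, d
FROM date_struct;

Power BI then points at the view and applies Json.Document() to s_json exactly as in your working example.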
2. Modify Power BI M Query to Parse the Malformed JSON
If you cannot change your Databricks view/table, you can repair the malformed JSON in Power BI before calling Json.Document(), for example by wrapping the unquoted date value in quotes with a text replacement:
let
    // ...previous steps that produce the table ds1...
    // Wrap the unquoted date value in quotes so the column holds valid JSON text.
    FixedS = Table.TransformColumns(
        ds1,
        {"s", each Text.Replace(Text.From(_), "2020-01-01", """2020-01-01"""), type text}
    ),
    // Parse the repaired JSON string into a record that Power BI can expand.
    AddedRecord = Table.AddColumn(FixedS, "record_from_s", each Json.Document([s]))
in
    AddedRecord
However, this approach is fragile because it only covers known date values and structures. It's much better to fix this at the source using to_json().
3. Project Struct Fields Directly
If you just need a field like s.d, select it directly in your SQL:
SELECT s.d, d FROM date_struct
This gives a flat table without needing JSON conversion.
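If s later grows more fields, Spark SQL's star expansion on structs (supported in Databricks) flattens all of them at once; a small variant of the query above:

-- Star expansion turns every struct field into its own top-level column;
-- with the current schema this yields a single column named d.
SELECT s.*
FROM date_struct;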
Recommendation
Always use to_json() in your Databricks SQL when exporting struct columns to Power BI. This guarantees the data arrives as a valid JSON string, which Power BI's Json.Document() can then parse without issues.