Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Convert table to nested JSON

spaz
New Contributor II

What is the easiest way to convert a table to a nested JSON?

1 ACCEPTED SOLUTION

Accepted Solutions

-werners-
Esteemed Contributor III

Since Spark can handle nested columns, I would first construct the nested structure in Spark (as of Spark 3.1.1 there is the excellent Column.withField method, with which you can build your structure).

Finally, write it out to JSON.

That seems to be the easiest way, but your case might be more complex; that is hard to say without more info.


5 REPLIES 5

AmanSehgal
Honored Contributor III

@Sergio Paz Could you please provide an example of your input table and the expected output JSON file?

spaz
New Contributor II

{
  "accounts": [
    "12516898"
  ],
  "deals": [
    {
      "dealId": "4897143",
      "promotionId": "AVC84897143",
      "conditions": {
        "paymentMethod": null,
        "simulationDateTime": {
          "startDateTime": "2022-03-16",
          "endDateTime": "2022-04-30"
        },
        "scaledLineItem": {
          "sharedMinimumQuantity": true,
          "skus": [
            "000000000000000031"
          ],
          "crossDiscount": true,
          "ranges": [
            {
              "index": "1",
              "from": 1,
              "to": 200
            }
          ]
        }
      },
      "output": {
        "lineItemScaledDiscount": {
          "ranges": [
            {
              "index": "1",
              "skus": [
                "000000000000000031"
              ],
              "type": "%",
              "discount": 5.5,
              "maxQuantity": null,
              "proportion": null,
              "fixed": true
            }
          ]
        }
      }
    }
  ]
}


Anonymous
Not applicable

@Sergio Paz How's it going? Are you able to give us more information?

Hi @Sergio Paz,

Just a friendly follow-up. Could you provide more details on your use case? Please share your code snippet so we can help you.
