Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Convert table in nested JSON

spaz
New Contributor II

What is the easiest way to convert a table to a nested JSON?

1 ACCEPTED SOLUTION

-werners-
Esteemed Contributor III

As Spark can handle nested columns, I would first construct the nested structure in Spark (as of Spark 3.1.1 there is the excellent Column.withField method, with which you can create your structure).

Finally, write it to JSON.

That seems to be the easiest way, but your case might be more complex; that is hard to say without more info.


5 REPLIES

AmanSehgal
Honored Contributor III

@Sergio Paz could you please provide an example of your input table and expected output JSON file?

spaz
New Contributor II

{
  "accounts": ["12516898"],
  "deals": [
    {
      "dealId": "4897143",
      "promotionId": "AVC84897143",
      "conditions": {
        "paymentMethod": null,
        "simulationDateTime": {
          "startDateTime": "2022-03-16",
          "endDateTime": "2022-04-30"
        },
        "scaledLineItem": {
          "sharedMinimumQuantity": true,
          "skus": ["000000000000000031"],
          "crossDiscount": true,
          "ranges": [
            { "index": "1", "from": 1, "to": 200 }
          ]
        }
      },
      "output": {
        "lineItemScaledDiscount": {
          "ranges": [
            {
              "index": "1",
              "skus": ["000000000000000031"],
              "type": "%",
              "discount": 5.5,
              "maxQuantity": null,
              "proportion": null,
              "fixed": true
            }
          ]
        }
      }
    }
  ]
}


Anonymous
Not applicable

@Sergio Paz - How's it going? Are you able to give us more information?

Hi @Sergio Paz,

Just a friendly follow-up. Could you provide more details on your use case? Please share your code snippet so we can help you.
