01-17-2022 03:04 PM
I have a complex JSON file with a massive struct column. We regularly run into issues when we parse this file by defining a case class to extract the fields from the schema. The problem with this approach is that if the data type of one field in the case class is incorrect, none of the subsequent fields in that class get populated in the target. Hope the problem makes sense.
Is there an alternative? One option I can think of is to extract all the fields from the JSON file as strings and then do the data type conversion afterwards, but that adds an extra step. A better solution would be appreciated. Thanks.
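For reference, here is a minimal sketch of that string-first workaround in Spark/Scala (the path and the eventId/eventTime fields are hypothetical placeholders):

import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.{LongType, TimestampType}

// Read every primitive value as a string so a single bad value cannot break the parse
val raw = spark.read
  .option("primitivesAsString", "true")
  .json("/mnt/raw/events/*.json")

// Cast the fields afterwards; a bad value becomes null instead of
// dropping the remaining fields of the record
val typed = raw
  .withColumn("eventId", col("eventId").cast(LongType))
  .withColumn("eventTime", col("eventTime").cast(TimestampType))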
01-18-2022 01:11 PM
I think the solution to your problem is to use an Auto Loader stream to read the data, since it supports schema hints. If you don't want it to run as a continuous stream, it's enough to specify a trigger-once option, so the job finishes once all the JSON files are loaded.
Here is the documentation on loading JSON:
https://docs.databricks.com/spark/latest/structured-streaming/auto-loader-json.html
Then you can specify schema hints:
https://docs.databricks.com/spark/latest/structured-streaming/auto-loader-schema.html#schema-hints
Additionally, you can experiment with different schema evolution options for the stream.
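A minimal sketch of what that could look like in Scala (the paths, table name, and hinted columns are just placeholders):

import org.apache.spark.sql.streaming.Trigger

// Auto Loader reads the JSON files incrementally and infers the schema,
// with schema hints overriding the inferred type for specific columns
val events = spark.readStream
  .format("cloudFiles")
  .option("cloudFiles.format", "json")
  .option("cloudFiles.schemaLocation", "/mnt/schemas/events")
  .option("cloudFiles.schemaHints", "eventId BIGINT, eventTime TIMESTAMP")
  .load("/mnt/raw/events")

events.writeStream
  .option("checkpointLocation", "/mnt/checkpoints/events")
  .trigger(Trigger.Once())        // process everything available, then stop
  .toTable("bronze_events")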
01-18-2022 05:19 PM
Thanks Hubert! I did have Auto Loader as one of the candidate solutions, and I think it's a viable option for avoiding these schema parsing issues.
01-19-2022 08:43 AM
Hey there, @Matt M - If @Hubert Dudek's response solved the issue, would you be happy to mark his answer as best? It helps other members find the solution more quickly.
01-20-2022 07:23 AM
Yes, thanks.