01-03-2023 09:51 AM
I have been exploring Autoloader to ingest gzipped JSON files from an S3 source.
The notebook fails on the first run due to a schema mismatch; after re-running the notebook, the schema evolves and the ingestion completes successfully.
On analysing the schema of the Delta table created by the ingestion, I found two new columns, `id` and `optionsDefaults`.
These columns do not exist in the original data and contain only nulls.
Is there something I might be missing?
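For context, here is a hedged sketch of the kind of Auto Loader read described above. The bucket paths and table names are hypothetical, and the reader function needs a Databricks/Spark session, so nothing is executed at import time. The comment on `cloudFiles.schemaEvolutionMode` explains the fail-then-succeed pattern: in the default `addNewColumns` mode, the stream stops when it encounters unseen columns, records the updated schema in the schema location, and succeeds on restart.

```python
# Sketch of an Auto Loader read for gzipped JSON on S3 (hypothetical paths).
# Spark's JSON reader decompresses .json.gz files transparently.

AUTOLOADER_OPTIONS = {
    "cloudFiles.format": "json",
    "cloudFiles.schemaLocation": "s3://bucket/_schemas/events",
    # Default evolution mode: on unseen columns the stream fails, records the
    # new schema, and succeeds after a restart -- matching the behaviour above.
    "cloudFiles.schemaEvolutionMode": "addNewColumns",
}

def read_events(spark, source="s3://bucket/raw/events"):
    """Build the streaming reader; requires an active Spark session."""
    reader = spark.readStream.format("cloudFiles")
    for key, value in AUTOLOADER_OPTIONS.items():
        reader = reader.option(key, value)
    return reader.load(source)
```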
01-09-2023 12:05 PM
Hi, could you please provide a screenshot (before and after) and, if possible, the notebook content?
01-11-2023 06:49 AM
Hi @Keshav Saini, We haven’t heard from you since the last response from @Debayan Mukherjee , and I was checking back to see if his suggestions helped you.
Otherwise, if you have found a solution, please share it with the community, as it can be helpful to others.
Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.
01-13-2023 02:26 AM
Hi @Debayan Mukherjee , @Kaniz Fatma
Thank you for replying to my question.
I was able to figure out the issue: I was creating the schema and checkpoint folders under the same path as the Auto Loader source location. Because the source directory then contained the schema and checkpoint metadata as well, Auto Loader picked those files up as new input on every run, which changed the inferred schema each time.
I fixed this by pointing the schema and checkpoint locations to a path outside the source location.
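The fix can be sketched as follows. The paths are hypothetical, and the streaming reader needs a Databricks/Spark session, so it is wrapped in a function; the `is_under` helper just makes the mistake described above checkable — the metadata folders must not live inside the source prefix.

```python
import posixpath

# Hypothetical paths -- the point is that schema/checkpoint locations
# are OUTSIDE the Auto Loader source directory.
source_path = "s3://bucket/raw/events"
schema_path = "s3://bucket/_autoloader/events/schema"
checkpoint_path = "s3://bucket/_autoloader/events/checkpoint"

def is_under(child, parent):
    """True if `child` lies inside the `parent` prefix (the original mistake)."""
    return posixpath.normpath(child).startswith(posixpath.normpath(parent) + "/")

# Guard against re-introducing the bug: metadata inside the source path
# would be ingested as new input files on the next run.
assert not is_under(schema_path, source_path)
assert not is_under(checkpoint_path, source_path)

def build_stream(spark):
    """Auto Loader read with a separate schema location (needs a Spark session)."""
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", schema_path)
        .load(source_path)
    )
# In a notebook, write with:
#   build_stream(spark).writeStream.option("checkpointLocation", checkpoint_path)...
```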
01-13-2023 04:05 AM
Hi @Keshav Saini, I sincerely appreciate your help with the question you've posted. Thank you for being a valuable member of our community.