Copying a file from DBFS to a Databricks table: is there a way to get errors at the record level?
07-12-2024 07:42 AM
We have a file of data to be ingested into a Databricks table, and we are following the approach below:
- Upload the file to DBFS.
- Create a temporary table and load the file into it with CREATE TABLE [USING].
- Use MERGE INTO to merge the temp table created in step 2 into the target table (a sketch of these two statements follows this list).
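For reference, here is a minimal SQL sketch of steps 2 and 3. All table names, column names, and the DBFS path are hypothetical placeholders, not taken from our actual job:

```sql
-- Step 2: stage the uploaded file as a table over the DBFS path
-- (placeholder path and schema options, for illustration only).
CREATE TABLE temp_table
USING CSV
OPTIONS (
  path '/FileStore/tables/input_file.csv',  -- hypothetical DBFS location
  header 'true',
  inferSchema 'true'
);

-- Step 3: merge the staged rows into the target Delta table.
MERGE INTO target_table AS t
USING temp_table AS s
ON t.id = s.id                              -- hypothetical join key
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;
```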
During this process, a failure can occur at step 2 or step 3.
We are observing that if there are any anomalies or incorrectly formatted values in the input file or target table, the entire MERGE INTO / CREATE TABLE [USING] command fails.
Is there a way to capture the reason for failure at the record level? We tried supplying the badRecordsPath option as mentioned in the documentation, but no error files were created.
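Since badRecordsPath is a data source option, in a CREATE TABLE [USING] statement it would be passed through the OPTIONS clause, roughly like the sketch below (paths are placeholders, and we are assuming the option is honored when passed this way):

```sql
-- Staging table with badRecordsPath set, so malformed rows should be
-- written out as JSON error files instead of failing the whole load.
CREATE TABLE temp_table
USING CSV
OPTIONS (
  path '/FileStore/tables/input_file.csv',   -- hypothetical input location
  header 'true',
  badRecordsPath '/FileStore/bad_records/'   -- hypothetical error-file location
);
```

One possible explanation for the missing error files: CREATE TABLE [USING] only registers the table over the files, so the data may not actually be read, and bad records not written out, until the table is queried (for example, by the subsequent MERGE INTO).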
Is it safe to assume that either all records get into the temp table via the CREATE TABLE command, or none do if there is an anomaly with even a single record? It is not partial, correct? The same question for the MERGE INTO command: please confirm it is not partial either.
07-19-2024 05:48 AM
Hi @inagar ,
Thank you for reaching out to our community! We're here to help you.
To ensure we provide you with the best support, could you please take a moment to review the responses and choose the one that best answers your question? Your feedback not only helps us assist you better but also benefits other community members who may have similar questions in the future.
If you found the answer helpful, consider giving it a kudo. If the response fully addresses your question, please mark it as the accepted solution. This will help us close the thread and ensure your question is resolved.
We appreciate your participation and are here to assist you further if you need it!
Thanks,
Rishabh

