Catch rejected data (rows) while reading with Apache Spark
11-16-2021 09:36 AM
I work with Spark/Scala and receive data in different formats (.csv, .xlsx, .txt, etc.). When I read/write this data from different sources into a database, many records get rejected due to various issues (special characters, data-type mismatches between the source and target table, etc.), and in such cases my entire load fails.
What I want is a way to capture the rejected rows in a separate file and continue loading the remaining correct records into the database table; basically, not to stop the program because of a few rows, but to catch the rows causing the problem.
Example: I read a .csv with 98 valid rows and 2 corrupt rows. I want to write the 98 rows into the database and return the 2 corrupt rows to the user as a file.
P.S. I am receiving the data from users, so I can't define a schema; I need a dynamic way to read the file and filter the corrupt rows out into a file.
- Labels:
  - Apache
  - Apache Spark
  - Data
  - Spark
11-16-2021 10:01 AM
You can save corrupted records to a separate file:
.option("badRecordsPath", "/tmp/badRecordsPath")
and allow Spark to keep processing despite corrupted rows:
.option("mode", "PERMISSIVE")
You can also create a special column for corrupted records (note that this column must also be declared in the schema you pass, as a string field):
df = spark.read.csv('/tmp/inputFile.csv', header=True, schema=dataSchema,
                    enforceSchema=True, columnNameOfCorruptRecord='CORRUPTED')
11-16-2021 10:47 AM
Thank you for replying, but what I am trying to develop is a function that takes data from a user and filters out corrupt records, if any; that is, I want to do the same thing you did, but without a predefined schema.
11-16-2021 10:48 AM
I might get data from an external source, and I can't define a schema for the data that is read on my website/app.
11-16-2021 10:51 AM
Maybe Delta Live Tables?
Not sure if it is what you are looking for; I haven't used it myself. But it has schema evolution and expectations, so it might get you there.
11-16-2021 11:00 AM
Or maybe schema evolution on Delta Lake is enough, in combination with Hubert's answer.

