<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic I met with an issue when I was trying to use autoloader to read json files from Azure ADLS Gen2. I am getting this issue for specific files only. I checked the files are good and not corrupted. in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/i-met-with-an-issue-when-i-was-trying-to-use-autoloader-to-read/m-p/19178#M12829</link>
    <description>&lt;P&gt;I met with an issue when I was trying to use Auto Loader to read JSON files from Azure ADLS Gen2. I am getting this issue for specific files only. I checked the files, and they are good and not corrupted.&lt;/P&gt;&lt;P&gt;&lt;B&gt;Following is the issue:&lt;/B&gt;&lt;/P&gt;&lt;P&gt;&lt;B&gt;&lt;I&gt;java.lang.IllegalArgumentException: requirement failed: Literal must have a corresponding value to string, but class Integer found.&lt;/I&gt;&lt;/B&gt;&lt;/P&gt;&lt;P&gt;&lt;B&gt;&lt;I&gt;com.databricks.sql.io.FileReadException: Error while reading file /mnt/Source/kafka/customer_raw/filtered_data/year=2022/month=11/day=9/hour=15/part-00000-31413bcf-0a8f-480f-8d45-6970f4c4c9f7.c000.json.&lt;/I&gt;&lt;/B&gt;&lt;/P&gt;&lt;P&gt;&lt;B&gt;&lt;I&gt;&lt;U&gt;Detailed error attached as a file.&lt;/U&gt;&lt;/I&gt;&lt;/B&gt;&lt;/P&gt;&lt;P&gt;&lt;B&gt;I am using a Delta Live Tables pipeline. Here is the code:&lt;/B&gt;&lt;/P&gt;&lt;PRE&gt;@dlt.table(
  name = tablename,
  comment = "Create Bronze Table",
  table_properties = {
    "quality": "bronze"
  }
)
def Bronze_Table_Create():
  return (
    spark
    .readStream
    .schema(schemapath)
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", schemalocation)
    .option("cloudFiles.inferColumnTypes", "false")
    .option("cloudFiles.schemaEvolutionMode", "rescue")
    .load(sourcelocation)
  )&lt;/PRE&gt;&lt;P&gt;This is urgent. Any help is highly appreciated.&lt;/P&gt;</description>
    <pubDate>Fri, 02 Dec 2022 01:10:41 GMT</pubDate>
    <dc:creator>SRK</dc:creator>
    <dc:date>2022-12-02T01:10:41Z</dc:date>
    <item>
      <title>I met with an issue when I was trying to use autoloader to read json files from Azure ADLS Gen2. I am getting this issue for specific files only. I checked the files are good and not corrupted.</title>
      <link>https://community.databricks.com/t5/data-engineering/i-met-with-an-issue-when-i-was-trying-to-use-autoloader-to-read/m-p/19178#M12829</link>
      <description>&lt;P&gt;I met with an issue when I was trying to use Auto Loader to read JSON files from Azure ADLS Gen2. I am getting this issue for specific files only. I checked the files, and they are good and not corrupted.&lt;/P&gt;&lt;P&gt;&lt;B&gt;Following is the issue:&lt;/B&gt;&lt;/P&gt;&lt;P&gt;&lt;B&gt;&lt;I&gt;java.lang.IllegalArgumentException: requirement failed: Literal must have a corresponding value to string, but class Integer found.&lt;/I&gt;&lt;/B&gt;&lt;/P&gt;&lt;P&gt;&lt;B&gt;&lt;I&gt;com.databricks.sql.io.FileReadException: Error while reading file /mnt/Source/kafka/customer_raw/filtered_data/year=2022/month=11/day=9/hour=15/part-00000-31413bcf-0a8f-480f-8d45-6970f4c4c9f7.c000.json.&lt;/I&gt;&lt;/B&gt;&lt;/P&gt;&lt;P&gt;&lt;B&gt;&lt;I&gt;&lt;U&gt;Detailed error attached as a file.&lt;/U&gt;&lt;/I&gt;&lt;/B&gt;&lt;/P&gt;&lt;P&gt;&lt;B&gt;I am using a Delta Live Tables pipeline. Here is the code:&lt;/B&gt;&lt;/P&gt;&lt;PRE&gt;@dlt.table(
  name = tablename,
  comment = "Create Bronze Table",
  table_properties = {
    "quality": "bronze"
  }
)
def Bronze_Table_Create():
  return (
    spark
    .readStream
    .schema(schemapath)
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", schemalocation)
    .option("cloudFiles.inferColumnTypes", "false")
    .option("cloudFiles.schemaEvolutionMode", "rescue")
    .load(sourcelocation)
  )&lt;/PRE&gt;&lt;P&gt;This is urgent. Any help is highly appreciated.&lt;/P&gt;</description>
      <pubDate>Fri, 02 Dec 2022 01:10:41 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/i-met-with-an-issue-when-i-was-trying-to-use-autoloader-to-read/m-p/19178#M12829</guid>
      <dc:creator>SRK</dc:creator>
      <dc:date>2022-12-02T01:10:41Z</dc:date>
    </item>
    <item>
      <title>Re: I met with an issue when I was trying to use autoloader to read json files from Azure ADLS Gen2. I am getting this issue for specific files only. I checked the files are good and not corrupted.</title>
      <link>https://community.databricks.com/t5/data-engineering/i-met-with-an-issue-when-i-was-trying-to-use-autoloader-to-read/m-p/19179#M12830</link>
      <description>&lt;P&gt;Hey @Swapnil Kamle&amp;nbsp;, can you try setting inferColumnTypes to true? By default, JSON columns should all be read as strings; I am not sure why it is failing.&lt;/P&gt;</description>
      <pubDate>Fri, 02 Dec 2022 05:17:54 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/i-met-with-an-issue-when-i-was-trying-to-use-autoloader-to-read/m-p/19179#M12830</guid>
      <dc:creator>Geeta1</dc:creator>
      <dc:date>2022-12-02T05:17:54Z</dc:date>
    </item>
    <item>
      <title>Re: I met with an issue when I was trying to use autoloader to read json files from Azure ADLS Gen2. I am getting this issue for specific files only. I checked the files are good and not corrupted.</title>
      <link>https://community.databricks.com/t5/data-engineering/i-met-with-an-issue-when-i-was-trying-to-use-autoloader-to-read/m-p/19180#M12831</link>
      <description>&lt;P&gt;I can't set inferColumnTypes to true, as I am passing the schema explicitly; I don't want to infer column types. It's failing for a few files only. I checked those files as well, and they look good.&lt;/P&gt;</description>
      <pubDate>Fri, 02 Dec 2022 05:33:12 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/i-met-with-an-issue-when-i-was-trying-to-use-autoloader-to-read/m-p/19180#M12831</guid>
      <dc:creator>SRK</dc:creator>
      <dc:date>2022-12-02T05:33:12Z</dc:date>
    </item>
    <item>
      <title>Re: I met with an issue when I was trying to use autoloader to read json files from Azure ADLS Gen2. I am getting this issue for specific files only. I checked the files are good and not corrupted.</title>
      <link>https://community.databricks.com/t5/data-engineering/i-met-with-an-issue-when-i-was-trying-to-use-autoloader-to-read/m-p/19181#M12832</link>
      <description>&lt;P&gt;I got the issue resolved. The issue was that, by mistake, we had duplicate columns in the schema file, and that is what triggered the error. However, the error message is totally misleading, which is why I wasn't able to rectify it sooner.&lt;/P&gt;</description>
      <pubDate>Fri, 02 Dec 2022 09:34:56 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/i-met-with-an-issue-when-i-was-trying-to-use-autoloader-to-read/m-p/19181#M12832</guid>
      <dc:creator>SRK</dc:creator>
      <dc:date>2022-12-02T09:34:56Z</dc:date>
    </item>
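The resolution above (duplicate column names in the schema file) can be sanity-checked before the pipeline ever runs, which avoids the misleading Literal/Integer error entirely. A minimal sketch, assuming the schema is available as a plain list of column names; the function and variable names here are illustrative, not from the thread:

```python
from collections import Counter

def find_duplicate_columns(column_names):
    # Count each name case-insensitively, since column-name matching
    # in Spark is case-insensitive by default.
    counts = Counter(name.lower() for name in column_names)
    # Return only the names that appear more than once, sorted for
    # stable output.
    return sorted(name for name, n in counts.items() if n > 1)

# Hypothetical schema with an accidental duplicate, mirroring the
# root cause reported in this thread.
schema_columns = ["customer_id", "name", "email", "name"]
print(find_duplicate_columns(schema_columns))  # prints ['name']
```

Running a check like this on the schema definition before starting the DLT pipeline surfaces the duplicate directly instead of failing mid-read on specific files.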
  </channel>
</rss>

