<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Malformed Input Exception while saving or retrieving Table in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/malformed-input-exception-while-saving-or-retreiving-table/m-p/61248#M31742</link>
    <description>&lt;P&gt;Thanks &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/9"&gt;@Retired_mod&lt;/a&gt; for your response.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Encoding issues:&lt;/STRONG&gt; I am reading data from a table in the same catalog and, after a series of transformations, saving it to another table in the same catalog. I believe encoding should not be an issue.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Check column data types:&lt;/STRONG&gt; Before saving data to the table, I ensure the schema matches the table schema by casting each column to the corresponding data type. I am not explicitly creating a schema using StructType, but the schema matches the table schema at the end of the transformation. As a result, I am able to save data to the table the first time.&lt;/P&gt;&lt;P&gt;Moreover, the source data does not change while I am testing.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Retry delay:&lt;/STRONG&gt; I tried adding a delay between each interaction with the Delta table. It did not help.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Delta table consistency:&lt;/STRONG&gt; Can I manage or check consistency programmatically, given that Delta table transactions appear to be managed internally?&lt;/P&gt;&lt;P&gt;Thank you again for your suggestions.&lt;/P&gt;</description>
    <pubDate>Tue, 20 Feb 2024 11:47:06 GMT</pubDate>
    <dc:creator>Chandraw</dc:creator>
    <dc:date>2024-02-20T11:47:06Z</dc:date>
    <item>
      <title>Malformed Input Exception while saving or retrieving Table</title>
      <link>https://community.databricks.com/t5/data-engineering/malformed-input-exception-while-saving-or-retreiving-table/m-p/61071#M31713</link>
      <description>&lt;P&gt;Hi everyone,&lt;/P&gt;&lt;P&gt;I am using DBR version 13 with managed tables in a custom catalog; the table location is on AWS S3.&lt;/P&gt;&lt;P&gt;I am running the notebook on a single-user cluster.&lt;/P&gt;&lt;P&gt;I am facing a MalformedInputException while saving data to tables or reading it.&lt;/P&gt;&lt;P&gt;When I run my notebook for the first time and the tables are empty, it works fine and data is saved to the tables. But when I immediately run the notebook again, I cannot save data and get the exception in the subject line.&lt;/P&gt;&lt;P&gt;However, if I run the notebook again the next day, or if I delete everything from the table, it works fine.&lt;/P&gt;&lt;P&gt;I am writing with df.write.mode('overwrite').saveAsTable('tablename') and reading the Delta table with DeltaTable.forName().&lt;/P&gt;&lt;P&gt;Error:&lt;/P&gt;&lt;P&gt;Py4JJavaError: An error occurred while calling z:io.delta.tables.DeltaTable.forName. : /or os4.saveAsTable() java.nio.charset.MalformedInputException: Input length = 1 at java.nio.charset.CoderResult.throwException(CoderResult.java:281) at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:339) at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178) at java.io.InputStreamReader.read(InputStreamReader.java:184) at java.io.BufferedReader.read1(BufferedReader.java:210) at java.io.BufferedReader.read(BufferedReader.java:286) at java.io.Reader.read(Reader.java:140) at scala.io.BufferedSource.mkString(BufferedSource.scala:98) at com.databricks.common.client.RawDBHttpClient.getResponseBody(DBHttpClient.scala:1229) at com.databricks.common.client.RawDBHttpClient.$anonfun$httpRequestInternal$1(DBHttpClient.scala:1191) at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:571)&lt;/P&gt;&lt;P&gt;Thanks in advance for any suggestions.&lt;/P&gt;</description>
      <pubDate>Mon, 19 Feb 2024 11:21:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/malformed-input-exception-while-saving-or-retreiving-table/m-p/61071#M31713</guid>
      <dc:creator>Chandraw</dc:creator>
      <dc:date>2024-02-19T11:21:01Z</dc:date>
    </item>
    <item>
      <title>Re: Malformed Input Exception while saving or retrieving Table</title>
      <link>https://community.databricks.com/t5/data-engineering/malformed-input-exception-while-saving-or-retreiving-table/m-p/61248#M31742</link>
      <description>&lt;P&gt;Thanks &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/9"&gt;@Retired_mod&lt;/a&gt; for your response.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Encoding issues:&lt;/STRONG&gt; I am reading data from a table in the same catalog and, after a series of transformations, saving it to another table in the same catalog. I believe encoding should not be an issue.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Check column data types:&lt;/STRONG&gt; Before saving data to the table, I ensure the schema matches the table schema by casting each column to the corresponding data type. I am not explicitly creating a schema using StructType, but the schema matches the table schema at the end of the transformation. As a result, I am able to save data to the table the first time.&lt;/P&gt;&lt;P&gt;Moreover, the source data does not change while I am testing.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Retry delay:&lt;/STRONG&gt; I tried adding a delay between each interaction with the Delta table. It did not help.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Delta table consistency:&lt;/STRONG&gt; Can I manage or check consistency programmatically, given that Delta table transactions appear to be managed internally?&lt;/P&gt;&lt;P&gt;Thank you again for your suggestions.&lt;/P&gt;</description>
      <pubDate>Tue, 20 Feb 2024 11:47:06 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/malformed-input-exception-while-saving-or-retreiving-table/m-p/61248#M31742</guid>
      <dc:creator>Chandraw</dc:creator>
      <dc:date>2024-02-20T11:47:06Z</dc:date>
    </item>
    <item>
      <title>Re: Malformed Input Exception while saving or retrieving Table</title>
      <link>https://community.databricks.com/t5/data-engineering/malformed-input-exception-while-saving-or-retreiving-table/m-p/62110#M31899</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/9"&gt;@Retired_mod&lt;/a&gt; The issue was resolved as soon as I deployed to a multi-node dev cluster.&lt;/P&gt;&lt;P&gt;The issue only occurs on single-user clusters. It looks like a limitation of running all updates on one node rather than on a distributed system.&lt;/P&gt;</description>
      <pubDate>Tue, 27 Feb 2024 12:15:42 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/malformed-input-exception-while-saving-or-retreiving-table/m-p/62110#M31899</guid>
      <dc:creator>Chandraw</dc:creator>
      <dc:date>2024-02-27T12:15:42Z</dc:date>
    </item>
  </channel>
</rss>