<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Databricks Data Type Conversion error in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/databricks-data-type-conversion-error/m-p/33602#M24573</link>
    <description>&lt;P&gt;What happens if you explicitly cast it?&lt;/P&gt;&lt;P&gt;I remember having such issues with implicit casting when going from Spark 2.x to 3.x, but they were solved by using explicit casting (not round()).&lt;/P&gt;</description>
    <pubDate>Wed, 08 Dec 2021 08:47:43 GMT</pubDate>
    <dc:creator>-werners-</dc:creator>
    <dc:date>2021-12-08T08:47:43Z</dc:date>
    <item>
      <title>Databricks Data Type Conversion error</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-data-type-conversion-error/m-p/33600#M24571</link>
      <description>&lt;P&gt;In Databricks, while writing data to the curated layer, we see the error: Failed to execute user defined function (Double =&amp;gt; decimal(38,18)). Has anyone faced this issue, and how can it be resolved?&lt;/P&gt;</description>
      <pubDate>Tue, 07 Dec 2021 19:10:38 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-data-type-conversion-error/m-p/33600#M24571</guid>
      <dc:creator>Vibhor</dc:creator>
      <dc:date>2021-12-07T19:10:38Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks Data Type Conversion error</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-data-type-conversion-error/m-p/33601#M24572</link>
      <description>&lt;P&gt;Hi Vibhor,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks for your question. Let's give it a while to see what the community comes up with. We'll circle back if we need to.&lt;/P&gt;</description>
      <pubDate>Tue, 07 Dec 2021 21:17:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-data-type-conversion-error/m-p/33601#M24572</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2021-12-07T21:17:46Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks Data Type Conversion error</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-data-type-conversion-error/m-p/33602#M24573</link>
      <description>&lt;P&gt;What happens if you explicitly cast it?&lt;/P&gt;&lt;P&gt;I remember having such issues with implicit casting when going from Spark 2.x to 3.x, but they were solved by using explicit casting (not round()).&lt;/P&gt;</description>
      <pubDate>Wed, 08 Dec 2021 08:47:43 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-data-type-conversion-error/m-p/33602#M24573</guid>
      <dc:creator>-werners-</dc:creator>
      <dc:date>2021-12-08T08:47:43Z</dc:date>
    </item>
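The explicit-cast suggestion above can be sketched outside Spark with Python's standard decimal module; in PySpark the analogous call is col("x").cast("decimal(38,18)"). The helper name below is ours, for illustration only:

```python
from decimal import Decimal, ROUND_HALF_UP

def to_decimal_38_18(x: float) -> Decimal:
    """Explicitly convert a double to a decimal(38,18)-style value,
    mirroring an explicit CAST(x AS DECIMAL(38,18)) rather than relying
    on implicit conversion."""
    # Go through str() so we start from the float's shortest decimal
    # representation, then pin the scale to exactly 18 fractional digits.
    return Decimal(str(x)).quantize(
        Decimal("1." + "0" * 18), rounding=ROUND_HALF_UP
    )

print(to_decimal_38_18(1.5))  # 1.500000000000000000
```

The point of the explicit cast is that the rounding and target scale are stated once, up front, instead of being left to version-dependent implicit coercion rules.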
    <item>
      <title>Re: Databricks Data Type Conversion error</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-data-type-conversion-error/m-p/33603#M24574</link>
      <description>&lt;P&gt;When we multiply two numbers, the result size can grow to roughly the sum of the two operands' sizes. Before we explicitly cast the product of these two, the result is wider than the target data type; it is something like: double(x,y) can't be converted to decimal(x,y-1). If we shrink the data types of the values to be multiplied before the multiplication, we get an incorrect result due to precision loss.&lt;/P&gt;</description>
      <pubDate>Wed, 08 Dec 2021 17:51:56 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-data-type-conversion-error/m-p/33603#M24574</guid>
      <dc:creator>Vibhor</dc:creator>
      <dc:date>2021-12-08T17:51:56Z</dc:date>
    </item>
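The trade-off described above (a product needing more digits than either operand, versus precision loss if the operands are shrunk first) can be demonstrated with Python's decimal module. Note that Spark's exact result-sizing rule for decimal(p1,s1) * decimal(p2,s2) is, to our understanding, decimal(p1+p2+1, s1+s2), so two wide operands can overflow decimal(38,18); the values below are illustrative only:

```python
from decimal import Decimal, getcontext

# Two operands that each fit comfortably in decimal(20,10).
a = Decimal("9999999999.9999999999")
b = Decimal("9999999999.9999999999")

getcontext().prec = 50  # give the product room; the exact result needs ~40 digits
exact = a * b  # full-precision product, wider than either operand

# The "shrink the operands before multiplying" workaround loses information:
a_small = a.quantize(Decimal("1.00"))  # keep only 2 fractional digits
b_small = b.quantize(Decimal("1.00"))
lossy = a_small * b_small

print(exact)
print(exact - lossy)  # non-zero difference = precision loss
```

This is why the error surfaces only at the cast: the intermediate product is computed at the wider size and cannot be squeezed into the narrower target without either an explicit cast (accepting defined rounding) or pre-shrunk operands (accepting the precision loss shown above).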
    <item>
      <title>Re: Databricks Data Type Conversion error</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-data-type-conversion-error/m-p/33605#M24576</link>
      <description>&lt;P&gt;Scala 2.11, Spark 2.4.5.&lt;/P&gt;</description>
      <pubDate>Tue, 08 Feb 2022 18:10:49 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-data-type-conversion-error/m-p/33605#M24576</guid>
      <dc:creator>Vibhor</dc:creator>
      <dc:date>2022-02-08T18:10:49Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks Data Type Conversion error</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-data-type-conversion-error/m-p/33606#M24577</link>
      <description>&lt;P&gt;Hi @Vibhor Sethi,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Could you please share more details on this issue? What's the error stack trace? Could you share your code? And are you writing to Delta, Parquet, or another format?&lt;/P&gt;</description>
      <pubDate>Tue, 15 Mar 2022 00:15:40 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-data-type-conversion-error/m-p/33606#M24577</guid>
      <dc:creator>jose_gonzalez</dc:creator>
      <dc:date>2022-03-15T00:15:40Z</dc:date>
    </item>
  </channel>
</rss>

