<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Float Value change when Load with spark? Full Path? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/float-value-change-when-load-with-spark-full-path/m-p/35148#M25820</link>
    <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;I have created my table in Databricks, and at this point everything is perfect: I get the same values as in my CSV.&lt;/P&gt;&lt;P&gt;For my column "Exposure" I have:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;0         0,00
1         0,00
2         0,00
3         0,00
4         0,00
...&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;But when I load my file with Spark, the Exposure column shows something different:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;0        14 032,24
1        14 032,24
2         8 061,94
3         8 061,94
4        15 506,37&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;I use this code to load my table:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;import pandas as pd

df = spark.table("imos_prior").toPandas()
df['Exposure']&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;Do you know how I could specify the full path of my table?&lt;/P&gt;&lt;P&gt;My table is located by default at default/imos_prior.&lt;/P&gt;&lt;P&gt;Or do you have any idea why the values could have changed? I hope it is just a question of a different file being used by Spark.&lt;/P&gt;&lt;P&gt;Thank you&lt;/P&gt;</description>
    <pubDate>Fri, 12 Nov 2021 21:15:18 GMT</pubDate>
    <dc:creator>Hola1801</dc:creator>
    <dc:date>2021-11-12T21:15:18Z</dc:date>
    <item>
      <title>Float Value change when Load with spark? Full Path?</title>
      <link>https://community.databricks.com/t5/data-engineering/float-value-change-when-load-with-spark-full-path/m-p/35148#M25820</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;I have created my table in Databricks, and at this point everything is perfect: I get the same values as in my CSV.&lt;/P&gt;&lt;P&gt;For my column "Exposure" I have:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;0         0,00
1         0,00
2         0,00
3         0,00
4         0,00
...&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;But when I load my file with Spark, the Exposure column shows something different:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;0        14 032,24
1        14 032,24
2         8 061,94
3         8 061,94
4        15 506,37&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;I use this code to load my table:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;import pandas as pd

df = spark.table("imos_prior").toPandas()
df['Exposure']&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;Do you know how I could specify the full path of my table?&lt;/P&gt;&lt;P&gt;My table is located by default at default/imos_prior.&lt;/P&gt;&lt;P&gt;Or do you have any idea why the values could have changed? I hope it is just a question of a different file being used by Spark.&lt;/P&gt;&lt;P&gt;Thank you&lt;/P&gt;</description>
      <pubDate>Fri, 12 Nov 2021 21:15:18 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/float-value-change-when-load-with-spark-full-path/m-p/35148#M25820</guid>
      <dc:creator>Hola1801</dc:creator>
      <dc:date>2021-11-12T21:15:18Z</dc:date>
    </item>
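The values in the question ("0,00", "14 032,24") use French number formatting: a space as the thousands separator and a comma as the decimal separator. If such strings survive loading as text, converting them to floats needs explicit normalisation. A minimal sketch of a hypothetical helper (not from the thread) that does this in plain Python:

```python
def parse_fr(s: str) -> float:
    """Convert a French-formatted number like '14 032,24' to a float."""
    # Drop thousands separators (regular, no-break, or narrow no-break
    # spaces), then swap the decimal comma for a dot.
    cleaned = s.replace("\u202f", "").replace("\u00a0", "").replace(" ", "")
    return float(cleaned.replace(",", "."))

print(parse_fr("14 032,24"))  # 14032.24
print(parse_fr("0,00"))       # 0.0
```

This only addresses the formatting aspect; it would not explain rows genuinely changing value between the CSV and the table, which points at a different underlying file or table definition.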
    <item>
      <title>Re: Float Value change when Load with spark? Full Path?</title>
      <link>https://community.databricks.com/t5/data-engineering/float-value-change-when-load-with-spark-full-path/m-p/35149#M25821</link>
      <description>&lt;P&gt;Hello @Anis Ben Salem - My name is Piper and I'm a moderator for Databricks. Welcome, and thank you for posting your question. Let's give it a bit longer for other members to respond. If we don't hear anything, we'll circle back around.&lt;/P&gt;</description>
      <pubDate>Sat, 13 Nov 2021 19:51:31 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/float-value-change-when-load-with-spark-full-path/m-p/35149#M25821</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2021-11-13T19:51:31Z</dc:date>
    </item>
    <item>
      <title>Re: Float Value change when Load with spark? Full Path?</title>
      <link>https://community.databricks.com/t5/data-engineering/float-value-change-when-load-with-spark-full-path/m-p/35150#M25822</link>
      <description>&lt;P&gt;Do you happen to know the path where the actual data resides?&lt;/P&gt;&lt;P&gt;Tables in Databricks are not the actual data but rather a view on top of the underlying files (Parquet, CSV, etc.).&lt;/P&gt;</description>
      <pubDate>Mon, 15 Nov 2021 10:19:08 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/float-value-change-when-load-with-spark-full-path/m-p/35150#M25822</guid>
      <dc:creator>-werners-</dc:creator>
      <dc:date>2021-11-15T10:19:08Z</dc:date>
    </item>
    <item>
      <title>Re: Float Value change when Load with spark? Full Path?</title>
      <link>https://community.databricks.com/t5/data-engineering/float-value-change-when-load-with-spark-full-path/m-p/35151#M25823</link>
      <description>&lt;P&gt;Hi @Anis Ben Salem,&lt;/P&gt;&lt;P&gt;How do you read your CSV file? Do you use the pandas or PySpark APIs? Also, how did you create your table?&lt;/P&gt;&lt;P&gt;Could you share more details on the code you are trying to run?&lt;/P&gt;</description>
      <pubDate>Mon, 22 Nov 2021 22:57:12 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/float-value-change-when-load-with-spark-full-path/m-p/35151#M25823</guid>
      <dc:creator>jose_gonzalez</dc:creator>
      <dc:date>2021-11-22T22:57:12Z</dc:date>
    </item>
  </channel>
</rss>

