<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic 10.4 LTS has outdated Snowflake Spark connector, how to force latest Snowflake Spark connector in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/10-4lts-has-outdated-snowflake-spark-connector-how-to-force/m-p/20282#M13668</link>
    <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I am trying to run my code from a Scala fat jar on Azure Databricks; it connects to Snowflake for its data.&lt;/P&gt;&lt;P&gt;I usually run my jar on 9.1 LTS.&lt;/P&gt;&lt;P&gt;However, when I run on 10.4 LTS, performance is degraded about 4x, and the log says:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;WARN SnowflakeConnectorUtils$: query pushdown is not supported because you are using Spark 3.2.1 with a connector designed to support Spark 3.1. Either use the version of spark supported by the connector or install a version of the connector that supports your version of spark.&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;I checked the driver logs, and it seems 10.4 LTS has a pre-installed version of the Snowflake Spark connector, net.snowflake:spark-snowflake_2.12:&lt;B&gt;2.9.0-spark_3.1&lt;/B&gt; (also mentioned &lt;A href="https://docs.microsoft.com/en-us/azure/databricks/release-notes/runtime/10.4#installed-java-and-scala-libraries-scala-212-cluster-version" alt="https://docs.microsoft.com/en-us/azure/databricks/release-notes/runtime/10.4#installed-java-and-scala-libraries-scala-212-cluster-version" target="_blank"&gt;here&lt;/A&gt;).&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Although I have put the latest Snowflake Spark connector (net.snowflake:spark-snowflake_2.13:2.10.0-spark_3.2) in my fat jar, Databricks picks the pre-installed one only.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Has anyone faced this issue? Does anyone know a solution?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
    <pubDate>Tue, 17 May 2022 19:54:10 GMT</pubDate>
    <dc:creator>DSam05</dc:creator>
    <dc:date>2022-05-17T19:54:10Z</dc:date>
    <item>
      <title>10.4 LTS has outdated Snowflake Spark connector, how to force latest Snowflake Spark connector</title>
      <link>https://community.databricks.com/t5/data-engineering/10-4lts-has-outdated-snowflake-spark-connector-how-to-force/m-p/20282#M13668</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I am trying to run my code from a Scala fat jar on Azure Databricks; it connects to Snowflake for its data.&lt;/P&gt;&lt;P&gt;I usually run my jar on 9.1 LTS.&lt;/P&gt;&lt;P&gt;However, when I run on 10.4 LTS, performance is degraded about 4x, and the log says:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;WARN SnowflakeConnectorUtils$: query pushdown is not supported because you are using Spark 3.2.1 with a connector designed to support Spark 3.1. Either use the version of spark supported by the connector or install a version of the connector that supports your version of spark.&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;I checked the driver logs, and it seems 10.4 LTS has a pre-installed version of the Snowflake Spark connector, net.snowflake:spark-snowflake_2.12:&lt;B&gt;2.9.0-spark_3.1&lt;/B&gt; (also mentioned &lt;A href="https://docs.microsoft.com/en-us/azure/databricks/release-notes/runtime/10.4#installed-java-and-scala-libraries-scala-212-cluster-version" alt="https://docs.microsoft.com/en-us/azure/databricks/release-notes/runtime/10.4#installed-java-and-scala-libraries-scala-212-cluster-version" target="_blank"&gt;here&lt;/A&gt;).&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Although I have put the latest Snowflake Spark connector (net.snowflake:spark-snowflake_2.13:2.10.0-spark_3.2) in my fat jar, Databricks picks the pre-installed one only.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Has anyone faced this issue? Does anyone know a solution?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Tue, 17 May 2022 19:54:10 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/10-4lts-has-outdated-snowflake-spark-connector-how-to-force/m-p/20282#M13668</guid>
      <dc:creator>DSam05</dc:creator>
      <dc:date>2022-05-17T19:54:10Z</dc:date>
    </item>
    <item>
      <title>Re: 10.4 LTS has outdated Snowflake Spark connector, how to force latest Snowflake Spark connector</title>
      <link>https://community.databricks.com/t5/data-engineering/10-4lts-has-outdated-snowflake-spark-connector-how-to-force/m-p/20283#M13669</link>
      <description>&lt;P&gt;Is the lower DBR working fine, @Soumyajit Datta​?&lt;/P&gt;</description>
      <pubDate>Thu, 26 May 2022 09:42:02 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/10-4lts-has-outdated-snowflake-spark-connector-how-to-force/m-p/20283#M13669</guid>
      <dc:creator>Atanu</dc:creator>
      <dc:date>2022-05-26T09:42:02Z</dc:date>
    </item>
    <item>
      <title>Re: 10.4 LTS has outdated Snowflake Spark connector, how to force latest Snowflake Spark connector</title>
      <link>https://community.databricks.com/t5/data-engineering/10-4lts-has-outdated-snowflake-spark-connector-how-to-force/m-p/20285#M13671</link>
      <description>&lt;P&gt;I also encountered a similar problem. This is a snippet from my log file:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;22/12/18 09:36:28 WARN SnowflakeConnectorUtils$: Query pushdown is not supported because you are using Spark 3.2.0 with a connector designed to support Spark 3.1. Either use the version of Spark supported by the connector or install a version of the connector that supports your version of Spark.
22/12/18 09:36:29 INFO SparkConnectorContext$: Spark Connector system config: {
  "spark_connector_version" : "2.9.0",
  "spark_version" : "3.2.0",
  "application_name" : "Databricks Shell",
  "scala_version" : "2.12.14",
  "java_version" : "11.0.12",
  "jdbc_version" : "3.13.3",
  "certified_jdbc_version" : "3.13.3",
  "os_name" : "Linux",
......
}&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;Shouldn't the versioning of the Snowflake connector be transparent to me (I use the one supplied by Databricks)?&lt;/P&gt;</description>
      <pubDate>Sun, 08 Jan 2023 11:38:15 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/10-4lts-has-outdated-snowflake-spark-connector-how-to-force/m-p/20285#M13671</guid>
      <dc:creator>slavat</dc:creator>
      <dc:date>2023-01-08T11:38:15Z</dc:date>
    </item>
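    <!-- The complaint above is a classpath conflict: the runtime's pre-installed spark-snowflake jar wins over the copy bundled in the fat jar. One way to see which copy actually wins is a short JVM check (plain Java, so it also works from a Scala cell): ask the context class loader where it would load the connector's source-provider class, net.snowflake.spark.snowflake.DefaultSource, from. This is a diagnostic sketch, not a fix; the printed path is illustrative.

    ```java
    public class ConnectorLocator {
        public static void main(String[] args) {
            ClassLoader cl = Thread.currentThread().getContextClassLoader();
            // Resource lookup by path string compiles even when the connector
            // is absent; a null result means no connector is on the classpath.
            java.net.URL url =
                cl.getResource("net/snowflake/spark/snowflake/DefaultSource.class");
            System.out.println(url == null ? "connector not on classpath"
                                           : url.toString());
        }
    }
    ```

    If the URL points into the runtime's own jars directory rather than the uploaded fat jar, the WARN message in the logs is explained: the driver resolved the pre-installed 2.9.0 connector first.
    
    
    -->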
  </channel>
</rss>

