10.4 LTS has an outdated Snowflake Spark connector; how to force the latest Snowflake Spark connector?

DSam05
New Contributor

Hi,

I am trying to run my code from a Scala fat JAR on Azure Databricks, which connects to Snowflake for the data.

I usually run my jar on 9.1 LTS.

However, when I run it on 10.4 LTS, performance degrades about 4x, and the log says:

WARN SnowflakeConnectorUtils$: query pushdown is not supported because you are using Spark 3.2.1 with a connector designed to support Spark 3.1. Either use the version of spark supported by the connector or install a version of the connector that supports your version of spark.

I checked the driver logs, and it seems 10.4 LTS has a pre-installed version of the Snowflake Spark connector, net.snowflake:spark-snowflake_2.12:2.9.0-spark_3.1 (also mentioned here).

Although I have put the latest Snowflake Spark connector (net.snowflake:spark-snowflake_2.13:2.10.0-spark_3.2) in my fat JAR, Databricks picks the pre-installed one only.
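Two things worth checking here. First, 10.4 LTS runs Scala 2.12, so a `_2.13` artifact cannot be loaded on that runtime at all; the `_2.12` build of the connector would be needed. Second, because the runtime's pre-installed jars sit earlier on the classpath than a fat JAR's classes, one common workaround is to shade (relocate) the connector packages inside the fat JAR so they cannot collide with the pre-installed copy. A minimal sbt-assembly sketch (the `shaded.` prefix is an arbitrary choice, not a required name):

```scala
// build.sbt — sketch, assuming sbt-assembly is enabled in project/plugins.sbt.
// Use the _2.12 artifact to match the Scala version of DBR 10.4 LTS.
libraryDependencies += "net.snowflake" % "spark-snowflake_2.12" % "2.10.0-spark_3.2"

// Relocate the connector's packages so the JVM cannot resolve them to the
// runtime's pre-installed spark-snowflake classes instead of yours.
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("net.snowflake.**" -> "shaded.net.snowflake.@1").inAll
)
```

After shading, the relocated data source would be referenced by its new package name in `spark.read.format(...)` rather than the short `"snowflake"` alias; verify against the sbt-assembly docs that service-loader files are rewritten for your plugin version.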

Has anyone faced this issue? Does anyone know a solution?

Thanks

3 REPLIES 3

Atanu
Esteemed Contributor

Is the lower DBR working fine, @Soumyajit Datta​?

Kaniz
Community Manager

Hi @Soumyajit Datta​, we haven’t heard from you since the last response from @Atanu Sarkar​, and I was checking back to see if you have a resolution yet. If you have a solution, please share it with the community, as it can be helpful to others. Otherwise, we will respond with more details and try to help.

slavat
New Contributor II

I also encountered a similar problem. This is a snippet from my log file:

22/12/18 09:36:28 WARN SnowflakeConnectorUtils$: Query pushdown is not supported because you are using Spark 3.2.0 with a connector designed to support Spark 3.1. Either use the version of Spark supported by the connector or install a version of the connector that supports your version of Spark.
22/12/18 09:36:29 INFO SparkConnectorContext$: Spark Connector system config: {
  "spark_connector_version" : "2.9.0",
  "spark_version" : "3.2.0",
  "application_name" : "Databricks Shell",
  "scala_version" : "2.12.14",
  "java_version" : "11.0.12",
  "jdbc_version" : "3.13.3",
  "certified_jdbc_version" : "3.13.3",
  "os_name" : "Linux",
......
}

Shouldn't the versioning of the Snowflake connector be transparent to me (I use the one supplied by Databricks)?
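One way to confirm which connector the cluster actually loaded is to ask the JVM where a connector class came from. A small sketch (the `jarOf` helper is my own name, not a Databricks API; `SnowflakeConnectorUtils` is the class emitting the warning in the log above):

```scala
// Returns the jar (or directory) a class was loaded from, if resolvable.
def jarOf(className: String): Option[String] =
  try {
    val src = Class.forName(className).getProtectionDomain.getCodeSource
    Option(src).map(_.getLocation.toString)
  } catch {
    case _: ClassNotFoundException => None
  }

// On the cluster this should print the path of the spark-snowflake jar
// in use, e.g. a 2.9.0-spark_3.1 jar on 10.4 LTS.
println(jarOf("net.snowflake.spark.snowflake.SnowflakeConnectorUtils"))
```

Running this in a notebook on both 9.1 LTS and 10.4 LTS would show whether the pre-installed jar is shadowing the one you supplied.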
