Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

DSam05
by New Contributor
  • 1864 Views
  • 3 replies
  • 1 kudos

10.4 LTS has an outdated Snowflake Spark connector; how do I force the latest Snowflake Spark connector?

Hi, I am trying to run my code from a Scala fat jar on Azure Databricks, which connects to Snowflake for the data. I usually run my jar on 9.1 LTS. However, when I run on 10.4 LTS the performance is 4x degraded, and in the log it says WARN SnowflakeConnect...
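For anyone hitting the same issue: the usual workaround is to bring your own Snowflake Spark connector built for Spark 3.2 (the Spark version behind 10.4 LTS) rather than relying on the one bundled with the runtime. A minimal build.sbt sketch, with illustrative version numbers that should be checked against Maven Central / the Snowflake release notes:

```scala
// build.sbt (sketch): pin a spark-snowflake build that targets Spark 3.2 / Scala 2.12,
// matching Databricks 10.4 LTS. The versions below are illustrative, not prescriptive.
libraryDependencies ++= Seq(
  "net.snowflake" %% "spark-snowflake" % "2.11.0-spark_3.2",
  "net.snowflake" %  "snowflake-jdbc"  % "3.13.30"
)
```

The same coordinates (net.snowflake:spark-snowflake_2.12:2.11.0-spark_3.2) can also be attached to the cluster as a Maven library instead of being shaded into the fat jar.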

Latest Reply
slavat
New Contributor II
  • 1 kudos

I also encountered a similar problem. This is a snippet from my log file: 22/12/18 09:36:28 WARN SnowflakeConnectorUtils$: Query pushdown is not supported because you are using Spark 3.2.0 with a connector designed to support Spark 3.1. Either use t...
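That warning is the giveaway: the connector on the classpath was built for Spark 3.1, so it disables query pushdown on Spark 3.2, which explains the slowdown. A small sketch of how this can be checked and, once a matching connector is attached, pushdown re-enabled for the session (assuming `spark` is the active SparkSession):

```scala
import net.snowflake.spark.snowflake.SnowflakeConnectorUtils

// 10.4 LTS should report a 3.2.x Spark version here.
println(s"Spark version: ${spark.version}")

// With a connector built for this Spark version on the classpath,
// query pushdown can be turned back on for the current session.
SnowflakeConnectorUtils.enablePushdownSession(spark)
```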

2 More Replies
mick042
by New Contributor III
  • 1093 Views
  • 1 reply
  • 0 kudos

Does Spark utilise a temporary stage when writing to Snowflake? How does that work?

Folks, when I want to push data to Snowflake I need to use a stage for files before copying the data over. However, when I utilise the net.snowflake.spark.snowflake.Utils library and do a spark.write, as in... spark.read.format("csv").option("header", ...
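A minimal write sketch, assuming the Snowflake Spark connector is attached and using placeholder connection options (all values below are hypothetical):

```scala
import net.snowflake.spark.snowflake.Utils.SNOWFLAKE_SOURCE_NAME

// Hypothetical connection options; substitute your own account details.
val sfOptions = Map(
  "sfURL"       -> "<account>.snowflakecomputing.com",
  "sfUser"      -> "<user>",
  "sfPassword"  -> "<password>",
  "sfDatabase"  -> "<database>",
  "sfSchema"    -> "<schema>",
  "sfWarehouse" -> "<warehouse>"
)

// The connector copies the DataFrame to a temporary internal stage and issues
// a COPY INTO behind the scenes; no explicit stage management is needed here.
spark.read
  .format("csv")
  .option("header", "true")
  .load("/path/to/input.csv")   // hypothetical input path
  .write
  .format(SNOWFLAKE_SOURCE_NAME)
  .options(sfOptions)
  .option("dbtable", "TARGET_TABLE")
  .mode("append")
  .save()
```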

Latest Reply
mick042
New Contributor III
  • 0 kudos

Yes, it uses a temporary stage. I should have just looked in the Snowflake history.
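For anyone who wants to confirm this from the Spark side, one way is to query Snowflake's QUERY_HISTORY table function through the connector; a sketch reusing the hypothetical sfOptions map from the question above:

```scala
// Sketch: look for the COPY INTO that the connector issued from its temporary
// internal stage. Reuses the placeholder sfOptions defined earlier.
val history = spark.read
  .format("net.snowflake.spark.snowflake")
  .options(sfOptions)
  .option("query",
    """SELECT query_text, start_time
      |FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
      |WHERE query_text ILIKE 'COPY INTO%'
      |ORDER BY start_time DESC""".stripMargin)
  .load()

history.show(truncate = false)
```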
