Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by AkifCakir (New Contributor II)
  • 22655 Views
  • 3 replies
  • 4 kudos

Resolved! Why does Spark save mode "overwrite" always drop the table even though "truncate" is true?

Hi dear team, I am trying to import data from Databricks into an Exasol DB. I am using the following code with Spark version 3.0.1: dfw.write \ .format("jdbc") \ .option("driver", exa_driver) \ .option("url", exa_url) \ .option("db...

Latest Reply
Gembo (New Contributor III)
  • 4 kudos

@AkifCakir, were you able to find a way to truncate without dropping the table using the .write function? I am facing the same issue as well.

2 More Replies
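For context, a minimal PySpark sketch of the pattern under discussion: with mode("overwrite"), the "truncate" option only takes effect when Spark's JDBC dialect for the target database supports truncation; otherwise Spark falls back to dropping and recreating the table. The dfw, exa_driver, and exa_url names are taken from the post; the table name is a placeholder.

```python
# Minimal sketch (assumptions noted above): a JDBC overwrite that requests
# TRUNCATE instead of DROP/CREATE. Spark honors "truncate" only when the
# JDBC dialect for the target database supports it.
(dfw.write
    .format("jdbc")
    .option("driver", exa_driver)             # driver class, from the post
    .option("url", exa_url)                   # JDBC URL, from the post
    .option("dbtable", "my_schema.my_table")  # placeholder table name
    .option("truncate", "true")               # keep the table, clear the rows
    .mode("overwrite")
    .save())
```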
by raymund (New Contributor III)
  • 4045 Views
  • 7 replies
  • 5 kudos

Resolved! Why does adding the package 'org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1' fail on runtime 9.1.x-scala2.12 but succeed on runtime 8.2.x-scala2.12?

Using a Databricks spark-submit job, setting a new cluster:
1] "spark_version": "8.2.x-scala2.12" => OK, works fine
2] "spark_version": "9.1.x-scala2.12" => FAIL, with errors:
Exception in thread "main" java.lang.ExceptionInInitializerError at com.databricks...

Latest Reply
raymund (New Contributor III)
  • 5 kudos

This has been resolved by adding the following spark_conf (not through --conf): "spark.hadoop.fs.file.impl": "org.apache.hadoop.fs.LocalFileSystem". Example:
"new_cluster": {
  "spark_version": "9.1.x-scala2.12",
  ...
  "spark_conf": {
    "spark.hadoop.fs.file.impl": "org.apache.hadoop.fs.LocalFileSystem"
  }
}

6 More Replies
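A sketch of how that workaround might look in a Jobs API payload, written as a Python dict: only spark_version and the spark_conf key come from the accepted answer, while node_type_id and num_workers are placeholders, not from the thread.

```python
# Minimal sketch of a Jobs API "new_cluster" spec carrying the fix from the
# accepted answer. Only spark_version and spark_conf come from the thread;
# node_type_id and num_workers are placeholder values.
new_cluster = {
    "spark_version": "9.1.x-scala2.12",
    "node_type_id": "i3.xlarge",   # placeholder instance type
    "num_workers": 2,              # placeholder cluster size
    "spark_conf": {
        # Workaround from the accepted answer: set via spark_conf,
        # not through --conf.
        "spark.hadoop.fs.file.impl": "org.apache.hadoop.fs.LocalFileSystem",
    },
}
```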
by dbu_spark (New Contributor III)
  • 7400 Views
  • 10 replies
  • 6 kudos

Older Spark version loaded into the Spark notebook

I have the Databricks runtime for a job set to the latest 10.0 Beta (includes Apache Spark 3.2.0, Scala 2.12). In the notebook, when I check the Spark version, I see version 3.1.0 instead of version 3.2.0. I need Spark version 3.2 to process workloads a...

[Screenshot attachment: Screen Shot 2021-10-20 at 11.45.10 AM]
Latest Reply
jose_gonzalez (Databricks Employee)
  • 6 kudos

Hi @Dhaivat Upadhyay, good news: DBR 10 was released yesterday, October 20th. You can find more details on the release notes website.

9 More Replies
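For anyone checking the same thing, a minimal sketch of verifying the runtime's Spark version from inside a notebook; spark and sc are the SparkSession and SparkContext that Databricks notebooks predefine.

```python
# Minimal sketch: confirm which Spark version the attached cluster provides.
# `spark` (SparkSession) and `sc` (SparkContext) are predefined in Databricks
# notebooks.
print(spark.version)  # expected "3.2.0" on a DBR 10.x cluster
print(sc.version)     # same value, reported via the SparkContext
```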