Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

duliu
by New Contributor II
  • 8848 Views
  • 6 replies
  • 0 kudos

databricks-connect fails with java.lang.IllegalStateException: No api token found in local properties

I configured databricks-connect locally and want to run Spark code against a remote cluster. I verified that `databricks-connect test` passes and that it can connect to the remote Databricks cluster. However, when I query a table or read parquet from S3, it fails wi...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Du Liu: The error message suggests that there is no API token found in the local properties. This could be the cause of the failure when trying to access the tables or read parquet files from S3. To fix this issue, you need to ensure that the API to...

  • 0 kudos
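One way to act on the reply above is to check that the databricks-connect configuration file actually contains a token. A minimal sketch, assuming the standard `~/.databricks-connect` JSON file that `databricks-connect configure` normally creates; all host, token, and cluster values below are placeholders, not real credentials:

```python
# Sketch: verify the databricks-connect config holds an API token.
# Placeholder values only; substitute your own workspace details.
import json
import os

config = {
    "host": "https://<your-workspace>.cloud.databricks.com",
    "token": os.environ.get("DATABRICKS_TOKEN", "<personal-access-token>"),
    "cluster_id": "<cluster-id>",
    "org_id": "0",
    "port": "15001",
}

path = os.path.expanduser("~/.databricks-connect")
# In practice you would compare this dict against json.load(open(path));
# here we only print what the file is expected to contain.
print(f"expected keys at {path}: {sorted(config)}")
```

Another option, if the file is in place but the error persists, is to set the token directly on the Spark conf in code via the `spark.databricks.service.token` property before creating the session.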
User16826994223
by Honored Contributor III
  • 2513 Views
  • 1 reply
  • 0 kudos

Resolved! Change in spark code if I migrate from spark 2.4 to 3.0

I am thinking of migrating from Spark 2.4 to 3.0. What should I know about the changes that I need to look at while migrating?

Latest Reply
User16826994223
Honored Contributor III
  • 0 kudos

I see there are many changes that you need to take care of if you have written code on Spark 2.4. The changes are in:
  • Dataset API statements
  • Built-in UDFs and functions
More you can get from the Spark documentation: https://spark.apache.org/docs/latest/sql-migrat...

  • 0 kudos
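Two concrete examples of the kind of changes the reply refers to: Spark 3.0 switched datetime parsing to the Proleptic Gregorian calendar, and untyped Scala UDFs (`udf(f)` with no return type) are rejected by default. While migrating, legacy flags can temporarily restore 2.4 behavior; a sketch of `spark-defaults.conf` entries (flag names are from the Spark 3.0 SQL migration guide, values are illustrative):

```
# Restore Spark 2.4-style datetime pattern parsing during migration
spark.sql.legacy.timeParserPolicy        LEGACY
# Allow untyped Scala UDFs (udf(f)) that Spark 3.0 rejects by default
spark.sql.legacy.allowUntypedScalaUDF    true
```

These flags are intended as a bridge while code is updated, not as a permanent setting.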
alexott
by Databricks Employee
  • 2469 Views
  • 1 reply
  • 0 kudos

What libraries could be used for unit testing of the Spark code?

We need to add unit test cases for our code that we're writing in Scala and Python. But we can't use calls like `assertEqual` for comparing the content of DataFrames. Are there any special libraries for that?

Latest Reply
alexott
Databricks Employee
  • 0 kudos

There are several libraries for Scala and Python that help with writing unit tests for Spark code. For Scala you can use the following: Built-in Spark test suite - it's designed to test all parts of Spark. It supports RDD, DataFrame/Dataset, Streaming API...

  • 0 kudos
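Whichever library is chosen, the core idea is the same: compare the schemas and collected rows of the expected and actual DataFrames, usually ignoring row order since Spark gives no ordering guarantees. A minimal sketch of that idea in plain Python (the helper name is my own; with a real SparkSession you would pass `df.collect()` instead of literal tuples):

```python
def assert_rows_equal(actual, expected, check_order=False):
    """Assert two lists of row tuples are equal, ignoring order by default.

    This mimics what Spark testing libraries do after df.collect():
    rows are sorted before comparing because Spark does not guarantee
    any particular output order.
    """
    if not check_order:
        actual = sorted(actual)
        expected = sorted(expected)
    assert actual == expected, f"rows differ:\n  {actual}\nvs\n  {expected}"

# Plain tuples stand in for collected Rows in this sketch.
assert_rows_equal([(1, "a"), (2, "b")], [(2, "b"), (1, "a")])
```

A comparison like this sidesteps the problem with `assertEqual`: DataFrame objects themselves never compare equal, so the test must operate on their materialized contents.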