Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor

lily1
New Contributor III

When I call a function in the google-cloud-bigquery:2.7.0 jar, it calls into the gax:2.12.2 jar, which in turn calls a function in the Guava jar. That Guava jar is a Databricks default library located at /databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--com.google.guava--guava--com.google.guava__guava__15.0.jar.

The problem is that this guava:15.0 jar doesn't have the directExecutor method, so it keeps giving me this error:

NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
	at com.google.api.gax.retrying.BasicRetryingFuture.<init>(BasicRetryingFuture.java:88)
	at com.google.api.gax.retrying.DirectRetryingExecutor.createFuture(DirectRetryingExecutor.java:86)
	at com.google.api.gax.retrying.DirectRetryingExecutor.createFuture(DirectRetryingExecutor.java:73)
	at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:85)
	at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
	at com.google.cloud.bigquery.BigQueryImpl.create(BigQueryImpl.java:365)
	at com.google.cloud.bigquery.BigQueryImpl.create(BigQueryImpl.java:345)

I tried to remove this default Guava jar by following this KB article (https://kb.databricks.com/libraries/replace-default-jar-new-jar.html), but once I did, I got the error below and couldn't start my cluster.

Cluster terminated. Reason: Spark error
 
Spark encountered an error on startup. This issue can be caused by invalid Spark configurations or malfunctioning init scripts. Please refer to the Spark driver logs to troubleshoot this issue, and contact Databricks if the problem persists.
 
Internal error message: Spark error: Driver down

In conclusion, my goal is to upgrade the version of the Guava jar file.

How could I fix this problem?

Thanks.

ACCEPTED SOLUTION

lily1
New Contributor III

I solved this issue by following the KB article with guava-20.0 (guava-19.0 didn't work).

But I appreciate your effort and reply.


3 REPLIES

Anonymous
Not applicable

Hi,

It looks like you need a particular method, directExecutor, which doesn't exist in the default cluster jars. Did you replace the jar, or only remove the Guava 15 jar? If the latter, that would explain the cluster start failure.

Going through the Guava API docs, the directExecutor() method shows up in later versions of the MoreExecutors class. For example, you can see it in version 19 (https://guava.dev/releases/19.0/api/docs/com/google/common/util/concurrent/MoreExecutors.html) and in version 23 (https://guava.dev/releases/23.0/api/docs/).

From a little searching, it looks like later versions of this jar may break some backwards compatibility. The earliest version in which this method exists is 18.

To ensure you replace the jar, you should try the following:

  1. Download version 18 or 19; there should be links on GitHub for it.
  2. Follow these directions to upload the new jar to DBFS: https://docs.databricks.com/libraries/workspace-libraries.html
  3. Follow the KB article you posted to create a cluster-scoped init script to remove the guava 15 jar and copy your new guava 18/19 jar to /databricks/jars/

Just like the warning said in the KB article, it will require testing and might not work due to dependencies and other jars' limitations.
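The remove-and-copy in step 3 could look something like the sketch below. This is a hypothetical cluster-scoped init script, not verified on a live cluster; the DBFS staging path and the guava-19.0.jar name are assumptions, so adjust them to wherever you uploaded the jar in step 2.

```shell
#!/bin/bash
# Sketch of step 3: swap the default Guava 15 jar for a newer one staged on DBFS.

replace_guava() {
  local jars_dir="$1"   # the cluster's jar directory, e.g. /databricks/jars
  local new_jar="$2"    # the staged replacement, e.g. /dbfs/FileStore/jars/guava-19.0.jar

  # Remove the stock Guava 15 jar (name pattern taken from the question).
  rm -f "$jars_dir"/*com.google.guava--guava*15.0*.jar

  # Drop the replacement into the classpath directory.
  cp "$new_jar" "$jars_dir/"
}

# Guarded so the script is a no-op anywhere but on the cluster.
if [ -d /databricks/jars ]; then
  replace_guava /databricks/jars /dbfs/FileStore/jars/guava-19.0.jar
fi
```

Since init scripts run before the driver starts, a mistake here can leave the cluster unable to boot (as seen above), so test on a throwaway cluster first.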


Anonymous
Not applicable

Hey there @Lily Kim

Hope you are doing well!

Thank you for posting your question. We are happy that you were able to find the solution.

Would you please mark the answer as best?

We'd love to hear from you.
