java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManager.<init>(Lcom/amazonaw

RahuP
New Contributor II
2 REPLIES

Kaniz_Fatma
Community Manager

Hi @RahuP, the error message you’re encountering, java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManager.<init>, indicates a version mismatch: the code invoking the TransferManager constructor was compiled against a different version of the AWS SDK for Java than the one on the runtime classpath.

Let’s break it down:

  1. AWS SDK for Java: This library provides the APIs for interacting with AWS services, including Amazon S3 (Simple Storage Service).
  2. TransferManager: A class within the SDK that simplifies uploading and downloading data to/from Amazon S3. It manages multipart uploads, retries, and progress tracking.
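
Before working through the steps below, it can help to confirm which SDK build is actually on the runtime classpath. Here is a minimal diagnostic sketch; the SdkVersionCheck wrapper class is hypothetical, while TransferManager and VersionInfoUtils are real SDK classes:

    import com.amazonaws.services.s3.transfer.TransferManager;
    import com.amazonaws.util.VersionInfoUtils;

    public class SdkVersionCheck {
        public static void main(String[] args) {
            // Jar from which TransferManager was actually loaded
            // (getCodeSource() can be null under some classloaders).
            System.out.println("TransferManager loaded from: "
                    + TransferManager.class.getProtectionDomain().getCodeSource().getLocation());
            // Version string reported by aws-java-sdk-core.
            System.out.println("AWS SDK for Java version: " + VersionInfoUtils.getVersion());
        }
    }

If the printed jar or version is not the one you expect, a conflicting copy of the SDK is shadowing yours.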

Here are some steps to troubleshoot this issue:

  1. Check Dependencies:

    • Ensure that you’re using compatible versions of the AWS SDK for Java and of the libraries that call it (such as hadoop-aws or Apache Spark).
    • If you’re using Spark, make sure the hadoop-aws module it runs with is compatible with your AWS SDK version.
  2. Version Compatibility:

    • The error occurs when the TransferManager constructor signature present at runtime differs from the one the calling code was compiled against, which happens when the SDK version changed underneath that code.
    • Verify that the SDK version on your classpath matches the version your code (or a library such as hadoop-aws) was built against; a reflection-based check is sketched after the sample code below.
  3. Dependency Management:

    • If you’re using Maven or Gradle, inspect the resolved dependency tree (e.g., mvn dependency:tree or gradle dependencies) for conflicting AWS SDK versions.
    • Explicitly pin the version of the AWS SDK for Java in your build configuration.
  4. Sample Code:

    • Here’s an example of how to use TransferManager to upload data to Amazon S3:
    import com.amazonaws.auth.AWSStaticCredentialsProvider;
    import com.amazonaws.auth.BasicAWSCredentials;
    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.transfer.TransferManager;
    import com.amazonaws.services.s3.transfer.TransferManagerBuilder;
    import com.amazonaws.services.s3.transfer.Upload;
    
    import java.io.File;
    
    public class S3Uploader {
        public static void main(String[] args) {
            String accessKey = "your-access-key";
            String secretKey = "your-secret-key";
            String bucketName = "your-bucket-name";
            String filePath = "path/to/your/file.txt";
    
            // The client builder takes a credentials *provider*, not raw credentials.
            BasicAWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
            AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
                    .withCredentials(new AWSStaticCredentialsProvider(credentials))
                    .withRegion("us-east-1") // use your bucket's region
                    .build();
    
            // Build the TransferManager through its builder rather than a constructor,
            // which is the supported pattern in current 1.11.x SDK versions.
            TransferManager transferManager = TransferManagerBuilder.standard()
                    .withS3Client(s3Client)
                    .build();
            Upload upload = transferManager.upload(bucketName, "file.txt", new File(filePath));
    
            try {
                upload.waitForCompletion(); // blocks until the (possibly multipart) upload finishes
                System.out.println("Upload completed successfully!");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                e.printStackTrace();
            } finally {
                transferManager.shutdownNow(); // also shuts down the wrapped S3 client
            }
        }
    }
    

Remember to replace the placeholders (your-access-key, your-secret-key, your-bucket-name, and path/to/your/file.txt) with your actual values.
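
If you want to verify the mismatch from step 2 directly, you can probe for the constructor via reflection. A minimal sketch; the two-argument (AmazonS3, ThreadPoolExecutor) signature is only an illustrative guess, so substitute the exact parameter types from the truncated signature in your stack trace:

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.transfer.TransferManager;
    
    import java.util.concurrent.ThreadPoolExecutor;
    
    public class ConstructorCheck {
        public static void main(String[] args) {
            try {
                // Illustrative signature only -- use the parameter types shown
                // in your own NoSuchMethodError stack trace.
                TransferManager.class.getConstructor(AmazonS3.class, ThreadPoolExecutor.class);
                System.out.println("Constructor exists in the SDK version on the classpath.");
            } catch (NoSuchMethodException e) {
                System.out.println("Constructor is missing: the caller was compiled"
                        + " against a different aws-java-sdk version.");
            }
        }
    }

If the check fails, aligning the aws-java-sdk version with the one your hadoop-aws release was built against (or upgrading both together) is usually the fix.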

If you’re still facing issues, please provide more context or specific code snippets, and I’ll be happy to assist further! 😊

RahuP
New Contributor II

@Kaniz_Fatma We are using Spark 3.1.2 with hadoop-aws-2.7.4 and aws-java-sdk-s3-1.11.655 on Databricks Runtime 9.1 LTS. According to the supported versions listed in the Databricks documentation, this combination should work. Could you please help us resolve this issue?
