
Caused by: org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "s3"

parimalpatil28
New Contributor III

Hello,

I am facing an issue when running an INSERT query or .saveAsTable(). The error thrown by the query is:
Caused by: org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "s3"

org.apache.spark.SparkException: [TASK_WRITE_FAILED] Task failed while writing rows to s3://dbricksunitycatalog/

 

What configuration am I missing? Can you please help me?

I tried adding an instance profile, but it didn't work.
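
For reference, a minimal sketch of the kind of write that hits this error (the catalog, schema, and table names are placeholders, not our real ones):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])

# Fails with UnsupportedFileSystemException: No FileSystem for scheme "s3"
df.write.mode("append").saveAsTable("main.default.demo_table")

# The equivalent INSERT fails the same way:
# spark.sql("INSERT INTO main.default.demo_table VALUES (3, 'c')")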

 

Thanks,

Parimal

1 ACCEPTED SOLUTION


parimalpatil28
New Contributor III

Hello @Retired_mod ,

Thanks for the help.
We also investigated internally and found the root cause.

Our product's configuration was overwriting the Databricks default spark.executor.extraClassPath conf; because of this, our clusters could not find the S3-related JARs.
When we compared a normal cluster with a cluster that has our product installed, we noticed this difference.
Once we added both our conf and the Databricks default conf, it started working.

Versions 11.3 LTS through 13.3 LTS were affected; it was working fine up to 10.4 LTS.
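
For anyone hitting the same thing, a rough sketch of how we compared the two clusters (standard PySpark; the /opt/ourproduct/lib/* path in the comment is only a placeholder for our product's entry, not a real path):

# `spark` is the SparkSession that Databricks notebooks provide by default.
# Print the executor classpath the cluster was actually launched with; run this
# on a stock cluster and on the cluster with our product installed, then diff.
print(spark.sparkContext.getConf().get("spark.executor.extraClassPath", "not set"))

# The fix went into the cluster's Spark config (Advanced options): append our
# entries to the Databricks default value instead of replacing it, roughly:
#   spark.executor.extraClassPath <default copied from a stock cluster>:/opt/ourproduct/lib/*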

Thanks,

Parimal


2 REPLIES

Hello @Retired_mod ,

We are still facing the same issue.
We have tried your solution for Unity Catalog.
We also checked the role-related configuration in AWS.

We are using 13.3 LTS. Can you please tell us which AWS JAR we should install and what configuration we should add in the advanced Spark config?

Can you suggest anything more for us to try?

 

Thanks,

Parimal

 

 

