10-13-2023 04:34 AM - edited 10-13-2023 04:37 AM
Hello,
I am facing an issue while running an INSERT query or calling .saveAsTable. The error thrown by the query is:
Caused by: org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "s3"
org.apache.spark.SparkException: [TASK_WRITE_FAILED] Task failed while writing rows to s3://dbricksunitycatalog/
What configuration am I missing? Can you please help me?
I tried adding an instance profile, but it didn't work.
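For context, here is a minimal sketch of the kind of write that fails; the catalog, schema, and table names below are placeholders, not the exact ones from our job:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical three-level Unity Catalog name; our real table resolves to
# a location under s3://dbricksunitycatalog/.
table = "my_catalog.my_schema.my_table"

# Both the DataFrame write and the SQL INSERT fail with
# UnsupportedFileSystemException: No FileSystem for scheme "s3".
df = spark.range(10)
df.write.format("delta").mode("overwrite").saveAsTable(table)

df.createOrReplaceTempView("source_view")
spark.sql(f"INSERT INTO {table} SELECT * FROM source_view")
```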
Thanks,
Parimal
Accepted Solutions
10-24-2023 10:32 PM
Hello @Retired_mod ,
Thanks for the help.
We also investigated internally and found the root cause.
Our product's configuration was overwriting the Databricks default spark.executor.extraClassPath setting, so our clusters could not find the S3-related JARs.
We noticed the difference when we compared a plain cluster with a cluster that had our product installed.
Once we set the property to our entries plus the Databricks defaults, it started working.
Versions 11.3 LTS through 13.3 LTS were affected; it worked fine up to 10.4 LTS.
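As a rough illustration of the check we did (the product JAR path below is a placeholder), you can print the effective executor classpath on a plain cluster and on the product-installed cluster and diff the two; the fix was to append our entries to the Databricks default value instead of replacing it:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Print the effective executor classpath; run this on both clusters and diff.
default_cp = spark.sparkContext.getConf().get("spark.executor.extraClassPath", "")
for entry in default_cp.split(":"):
    print(entry)

# Illustrative merge: keep the Databricks defaults and append our product's
# JAR directory (placeholder path) instead of overwriting the property.
product_cp = "/opt/our_product/jars/*"
merged = f"{default_cp}:{product_cp}" if default_cp else product_cp
print("spark.executor.extraClassPath =", merged)
```

The merged value is what goes into the cluster's advanced Spark config, since spark.executor.extraClassPath is a static setting that cannot be changed on a running cluster.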
Thanks,
Parimal
10-18-2023 03:07 AM
Hello @Retired_mod ,
We are still facing the same issue.
We tried your suggested solution for Unity Catalog.
We also checked the role-related configuration in AWS.
We are on 13.3 LTS. Can you please tell us which AWS JAR we should install and what configuration we should add in the advanced Spark config?
Is there anything else you can suggest we try?
Thanks,
Parimal