Dear Databricks Community Experts,
I am working with Databricks on AWS with Unity Catalog.
One use case for me is to uncompress files with many different extensions stored in an S3 bucket.
Below is my strategy:
- Move files from S3 to the local file system (where the Spark driver runs) via dbutils.fs.mv(dbfs_file, local_file)
- Uncompress the files via shell commands or Python packages
- Move the uncompressed files back to S3 via dbutils
Here,
dbfs_file ==> s3://path_to_file or dbfs://path_to_file (I am using Unity Catalog, not the mounting method)
local_file ==> file:///tmp/path_to_file
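For reference, here is a minimal sketch of what I am attempting, run in a Databricks notebook. The bucket names, paths, and archive formats below are placeholders for illustration, not my real locations:

```python
# Minimal sketch of the strategy above, run in a Databricks notebook
# (`dbutils` is a notebook global). Bucket names and paths are placeholders.
import gzip
import os
import shutil

src = "s3://my-bucket/raw/archive.tar.gz"   # compressed file on S3
local_in = "/tmp/archive.tar.gz"            # driver-local copy
local_out = "/tmp/extracted"                # driver-local extraction dir
dst = "s3://my-bucket/extracted/"           # destination prefix on S3

# Step 1: copy from S3 to the driver's local file system.
# This is the call that raises the SecurityException below.
dbutils.fs.cp(src, f"file://{local_in}")

# Step 2: uncompress locally. shutil.unpack_archive handles .zip, .tar,
# .tar.gz, etc.; a bare .gz file needs gzip directly.
os.makedirs(local_out, exist_ok=True)
try:
    shutil.unpack_archive(local_in, local_out)
except shutil.ReadError:
    with gzip.open(local_in, "rb") as f_in, \
         open(os.path.join(local_out, "uncompressed"), "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)

# Step 3: copy the uncompressed files back to S3.
dbutils.fs.cp(f"file://{local_out}", dst, recurse=True)
```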
When I use dbutils.fs.cp(dbfs_file, local_file), I get the error below:
ERROR - java.lang.SecurityException: Cannot use com.databricks.backend.daemon.driver.WorkspaceLocalFileSystem - local filesystem access is forbidden
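Reduced to a single notebook cell, this is enough to reproduce the failure (paths are placeholders):

```python
# Fails with the SecurityException above on my Unity Catalog cluster
# (`dbutils` is a notebook global; paths are placeholders).
dbutils.fs.cp(
    "s3://my-bucket/raw/archive.tar.gz",  # placeholder source on S3
    "file:///tmp/archive.tar.gz",         # driver-local destination
)
```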
How can I apply write permissions to the driver's local file system and resolve this issue?