Move Files from S3 to Local File System with Unity Catalog Enabled

madhav_dhruve
New Contributor III

Dear Databricks Community Experts,

I am working on Databricks on AWS with Unity Catalog.

One use case for me is to uncompress files, in several archive formats, stored in an S3 bucket.

Below is my strategy:

  1. Move files from S3 to the local file system (where the Spark driver runs) via dbutils.fs.mv(dbfs_file, local_file)
  2. Uncompress the files via shell commands or Python packages
  3. Move the uncompressed files back to S3 via dbutils
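The three steps above can be sketched as follows. This is a minimal sketch, assuming the archives are .zip or .tar variants; `dbutils` exists only inside a Databricks notebook or job, and the bucket/paths (`my-bucket`, `/tmp/extracted`) are hypothetical placeholders:

```python
import os
import tarfile
import zipfile

def uncompress(local_path: str, out_dir: str) -> None:
    """Uncompress a driver-local archive, dispatching on its extension."""
    if local_path.endswith(".zip"):
        with zipfile.ZipFile(local_path) as zf:
            zf.extractall(out_dir)
    elif local_path.endswith((".tar.gz", ".tgz", ".tar")):
        with tarfile.open(local_path) as tf:
            tf.extractall(out_dir)
    else:
        raise ValueError(f"unsupported archive type: {local_path}")

def process(s3_path: str) -> None:
    # `dbutils` is provided by the Databricks runtime; these calls and
    # all paths here are illustrative, not tested outside Databricks.
    name = os.path.basename(s3_path)
    local = f"/tmp/{name}"
    dbutils.fs.cp(s3_path, f"file:{local}")           # step 1: S3 -> driver
    uncompress(local, "/tmp/extracted")               # step 2: uncompress
    dbutils.fs.cp("file:/tmp/extracted",              # step 3: driver -> S3
                  "s3://my-bucket/extracted/", recurse=True)
```

Note the `file:` scheme on the driver-local side of each `dbutils.fs.cp` call, and `recurse=True` when copying the extracted directory back.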

Here,

dbfs_file ==> s3://path_to_file or dbfs://path_to_file (I am using Unity Catalog, not the mount method)

local_file ==> file:///tmp/path_to_file,

When I use dbutils.fs.cp(dbfs_file, local_file), I get the error below:

ERROR - java.lang.SecurityException: Cannot use com.databricks.backend.daemon.driver.WorkspaceLocalFileSystem - local filesystem access is forbidden

How can I apply write permissions and resolve this issue?
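The SecurityException above is typically reported on Unity Catalog clusters running in shared access mode, where `dbutils.fs` access to the `file:` scheme is blocked. A commonly suggested direction (an assumption on my part, not confirmed in this thread) is to run the job on a single-user access mode cluster, or to do the decompression step with plain Python file I/O on the driver, which goes through the OS rather than `WorkspaceLocalFileSystem`. A minimal sketch of the plain-I/O decompression step for a .gz file:

```python
import gzip
import os
import shutil
import tempfile

def gunzip_local(src_path: str, dst_path: str) -> str:
    """Decompress a .gz file using plain Python I/O (no dbutils involved)."""
    with gzip.open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        shutil.copyfileobj(src, dst)
    return dst_path

# Demo on a throwaway file under a temp dir (stand-in for /tmp on the driver).
work = tempfile.mkdtemp()
gz_path = os.path.join(work, "sample.txt.gz")
with gzip.open(gz_path, "wb") as f:
    f.write(b"hello from s3")  # stand-in for a downloaded object

out = gunzip_local(gz_path, os.path.join(work, "sample.txt"))
print(open(out, "rb").read())  # b'hello from s3'
```

Whether plain Python I/O to the driver's /tmp is permitted on your cluster depends on its access mode, so treat this as something to verify rather than a guaranteed fix.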

(Screenshot attached: Screenshot 2023-07-18 at 10.57.19 AM.png)

1 REPLY

rvadali2
New Contributor II

Did you find a solution to this?
