Delete a file from GCS folder

MBV3
New Contributor III

What is the best way to delete files from a GCP bucket inside a Spark job?

1 REPLY

Unforgiven
Valued Contributor III

@M Baig

Yes, you just need to create a service account for Databricks and then assign the Storage Admin role on the bucket. After that you can mount GCS the standard way:

  bucket_name = "<bucket-name>"
  mount_name = "<mount-name>"
  dbutils.fs.mount("gs://%s" % bucket_name, "/mnt/%s" % mount_name)
  display(dbutils.fs.ls("/mnt/%s" % mount_name))
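
Once the bucket is mounted, deleting files from inside the job is just a dbutils.fs.rm call. A minimal sketch, using the same placeholder bucket and mount names as above and a hypothetical file path:

  # Remove a single file from the mounted bucket (path is a placeholder)
  dbutils.fs.rm("/mnt/<mount-name>/path/to/file.csv")

  # Remove a whole folder recursively
  dbutils.fs.rm("/mnt/<mount-name>/path/to/folder/", recurse=True)

  # The same call also works directly against the gs:// URI,
  # provided the cluster's service account has access to the bucket
  dbutils.fs.rm("gs://<bucket-name>/path/to/file.csv")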

Here is a link for your reference before you start; I hope it helps with this case:

https://docs.gcp.databricks.com/external-data/gcs.html
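
If you would rather not mount the bucket at all, an alternative is to go through the Hadoop FileSystem API from the SparkSession. A minimal sketch, assuming the cluster already authenticates to GCS via its service account (the bucket and path are placeholders):

  # Delete a gs:// path directly via the Hadoop FileSystem API
  # (works in any Spark job, not only on Databricks)
  hadoop_conf = spark._jsc.hadoopConfiguration()
  path = spark._jvm.org.apache.hadoop.fs.Path("gs://<bucket-name>/path/to/file.csv")
  fs = path.getFileSystem(hadoop_conf)
  fs.delete(path, False)  # second argument: recursive delete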

Cheers
