Data Engineering

Using PySpark, can I write to an S3 path I don't have GetObject permission for?

gdoron
New Contributor

After Spark finishes writing the DataFrame to S3, it seems to validate the files it wrote with `getFileStatus`, which is a `HeadObject` request behind the scenes.

What if I'm only granted write and list-objects permissions, but not GetObject? Is there any way to instruct PySpark on Databricks to skip this validity check after a successful write?

2 REPLIES

-werners-
Esteemed Contributor III

Writing without the check could lead to corrupt data, so I doubt this is possible.

Lakshay
Esteemed Contributor

It is not possible in my opinion.
