Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Using PySpark, can I write to an S3 path I don't have GetObject permission to?

gdoron
New Contributor

After Spark finishes writing the DataFrame to S3, it appears to validate the files it wrote with `getFileStatus`, which is a `HeadObject` call behind the scenes.

What if I'm only granted write and list-objects permissions, but not GetObject? Is there any way to instruct PySpark on Databricks to skip this validity check after a successful write?
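For context, a minimal IAM policy sketch matching the permission set described above (the bucket name and path are hypothetical) might look like this. It grants write and list access but omits `s3:GetObject`, which also governs `HeadObject` requests:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowWriteAndListOnly",
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-bucket",
        "arn:aws:s3:::example-bucket/output/*"
      ]
    }
  ]
}
```

With a policy like this, the write itself can succeed, but Spark's post-write `getFileStatus` (a `HeadObject` under the hood) would be denied with a 403.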

2 REPLIES

-werners-
Esteemed Contributor III

Writing without the check could lead to corrupt data, so I doubt this is possible.

Lakshay
Databricks Employee

It is not possible in my opinion.
