How would i upload file stream object to S3 bucket using pyspark?

rammy
Contributor III

I can save data to S3 using PySpark, but I am not sure how to save a file stream object into an S3 bucket with PySpark. I could achieve this with plain Python, but after Unity Catalog was enabled on Databricks it always ends in an access denied exception.

I added a screenshot and sample code here for reference.

%python
import requests
import io
import boto3

# Placeholder URL -- the actual source was in the screenshot
response = requests.get("https://example.com/some-file")

s3_client = boto3.client('s3')
# Pass the downloaded bytes, not the literal string "response.content"
r = s3_client.put_object(Body=response.content, Bucket="bucketName", Key="fileName")

My questions are:

Why am I able to save data into an Amazon S3 bucket using PySpark but not with Python?

If saving data works with PySpark, why does it not work with Python? And if there is a reason for that, how can I save a file using PySpark?

5 REPLIES 5

Senthil1
Contributor

Hi @rammy,

The only reason the access denied exception happens is permissions.

Make sure the storage credential tagged to your Unity Catalog metastore has the right set of permissions to allow writing objects to S3. Also make sure your IAM role has this policy attached.
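For illustration, a minimal IAM policy statement allowing S3 writes could look like the sketch below (the bucket name is a placeholder; your storage credential's IAM role would need a statement along these lines):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::bucketName",
        "arn:aws:s3:::bucketName/*"
      ]
    }
  ]
}
```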

rammy
Contributor III

Thanks for your response @SENTHIL KUMARR MALLI SUDARSAN​. I will try it out. If that configuration is required, why does PySpark still allow writing to S3?

Make sure you are not writing to the bucket linked to the Unity Catalog metastore. Also try writing to a different bucket and share the exception you get. Also tell us which region the metastore bucket is linked to and which region the bucket you are writing to is in.

Kaniz
Community Manager

Hi @Ramesh Bathini​, we haven't heard from you since the last response from @SENTHIL KUMARR MALLI SUDARSAN​, and I was checking back to see if his suggestions helped you.

Otherwise, if you have a solution, please share it with the community, as it can be helpful to others.

Also, please don't forget to click on the "Select As Best" button whenever the information provided helps resolve your question.

rammy
Contributor III

I learned that a change is required in Unity Catalog to make this work with Python, and got a recommendation to use PySpark to store the file in S3.

I do not see much information anywhere about storing a file stream object in an S3 bucket. Can anyone share an example of it?

By the way, does PySpark support saving in .docx format?
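For what it's worth, here is a minimal sketch of uploading an in-memory stream to S3 with boto3 (bucket name and key are placeholders, and the actual upload call is commented out because it needs valid AWS credentials):

```python
import io

def upload_stream(stream, bucket: str, key: str) -> None:
    """Upload a seekable file-like object to S3 without writing it to disk."""
    import boto3  # lazy import: preinstalled on Databricks clusters
    s3 = boto3.client("s3")
    # upload_fileobj reads the stream in chunks (multipart for large objects)
    s3.upload_fileobj(stream, bucket, key)

# Wrap already-downloaded bytes (e.g. response.content) in a seekable stream
payload = io.BytesIO(b"bytes of the downloaded file")
payload.seek(0)
# upload_stream(payload, "bucketName", "fileName")  # needs valid AWS credentials
```

On the .docx question: PySpark's DataFrame writer targets data formats such as Parquet, CSV, or JSON, not arbitrary binary formats like .docx, so for a .docx file the stream-upload route above is the usual approach.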
