Data Engineering
Data Pull from S3

bchaubey
Contributor II

I have some files in S3 that I want to process with Databricks. How is that possible? Could you please help me with this?

4 REPLIES

Ajay-Pandey
Esteemed Contributor III

Hi @bchaubey,

You can access S3 data in Databricks by mounting your S3 bucket to Databricks.

Please refer to the document below, which will help you connect S3 with Databricks.

Databricks S3 Integration: 3 Easy Steps (hevodata.com)

-werners-
Esteemed Contributor III

Databricks is actually built to connect to data lake (ADLS/S3) data.
There are several methods to read data from such a data lake. The easiest way is the one already mentioned by ajaypanday6781 (using mounts).
However, Databricks advises against using mounts and recommends Unity Catalog instead.
https://docs.databricks.com/storage/amazon-s3.html

I would try the Unity Catalog method if it is possible for you, as Unity is free and gives you some nice features.
The mount method still works, but it cannot be used in combination with Unity.
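With Unity Catalog, an admin registers the bucket as an external location, after which you can read it with a plain s3:// path and no credentials in code. A minimal sketch for a Databricks notebook, assuming the external location is already configured and that the bucket path, file format, and options below are placeholders:

```python
# Sketch for a Databricks notebook: read directly from S3 through a
# Unity Catalog external location. "s3://my-bucket/raw/" is a placeholder;
# access is governed by the external location, so no keys appear in code.
df = (spark.read
      .format("csv")
      .option("header", "true")
      .load("s3://my-bucket/raw/"))
display(df)
```

Note that `spark` and `display` are globals provided by the Databricks notebook runtime; this snippet is not runnable outside that environment.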

Anonymous
Not applicable

 Hi @bchaubey 

Hope everything is going great.

Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we can help you. 

Cheers!

dream
New Contributor III
# Read the AWS credentials from a Databricks secret scope
access_key = dbutils.secrets.get(scope="aws", key="aws-access-key")
secret_key = dbutils.secrets.get(scope="aws", key="aws-secret-key")

# The secret key may contain "/" characters, which must be URL-encoded
# before being embedded in the mount URI
encoded_secret_key = secret_key.replace("/", "%2F")

aws_bucket_name = "<aws-bucket-name>"
mount_name = "<mount-name>"

# Mount the bucket under /mnt/<mount-name> and list its contents
dbutils.fs.mount(f"s3a://{access_key}:{encoded_secret_key}@{aws_bucket_name}", f"/mnt/{mount_name}")
display(dbutils.fs.ls(f"/mnt/{mount_name}"))

Source: https://docs.databricks.com/dbfs/mounts.html
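One caveat on the snippet above: `replace("/", "%2F")` only escapes slashes. If a secret key can contain other reserved characters (for example `+`), the standard-library `urllib.parse.quote` with `safe=""` percent-encodes them all. A small sketch using a made-up key value:

```python
from urllib.parse import quote

# Made-up secret value, for illustration only
secret_key = "abc/def+ghi"

# quote(..., safe="") percent-encodes every reserved character,
# not just "/", so keys containing "+" are handled too
encoded_secret_key = quote(secret_key, safe="")
print(encoded_secret_key)  # abc%2Fdef%2Bghi
```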