07-16-2023 08:48 AM
I have some files in S3 that I want to process through Databricks. How is that possible? Could you please help me with this?
07-16-2023 11:13 PM
Hi @bchaubey ,
You can access S3 data in Databricks by mounting your S3 bucket to Databricks.
Please refer to the document below, which will help you connect S3 with Databricks.
Databricks S3 Integration: 3 Easy Steps (hevodata.com)
07-17-2023 12:35 AM
Databricks is actually built to connect to data lake (ADLS/S3) data.
There are several methods to read data from such a data lake. The easiest way is the one already mentioned by ajaypanday6781 (using mounts).
However, Databricks advises against using mounts and recommends Unity Catalog instead.
https://docs.databricks.com/storage/amazon-s3.html
I would try the Unity Catalog method if it is possible for you, as Unity is free and gives you some nice features.
The mount method still works, but cannot be used in combination with Unity.
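To make the Unity Catalog route concrete: once an admin has created a storage credential and an external location covering the bucket, notebooks can read S3 paths directly, with no mount. A minimal sketch for a Databricks notebook, where the bucket name, path, and file format are placeholders I made up for illustration:

```python
# Sketch (Unity Catalog-enabled workspace).
# Assumes an admin has already created an external location
# covering s3://my-bucket -- bucket and path are placeholders.
df = (spark.read
      .format("csv")
      .option("header", "true")
      .load("s3://my-bucket/raw/events/"))

# Preview the data in the notebook
display(df.limit(10))
```

Access is then governed by Unity Catalog grants on the external location rather than by keys embedded in a mount.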
07-17-2023 09:40 PM
Hi @bchaubey
Hope everything is going great.
Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we can help you.
Cheers!
07-17-2023 10:05 PM
# Read the AWS credentials from a Databricks secret scope
access_key = dbutils.secrets.get(scope = "aws", key = "aws-access-key")
secret_key = dbutils.secrets.get(scope = "aws", key = "aws-secret-key")
# URL-encode the secret key so any "/" characters survive in the s3a URI
encoded_secret_key = secret_key.replace("/", "%2F")
aws_bucket_name = "<aws-bucket-name>"
mount_name = "<mount-name>"
# Mount the bucket under /mnt/<mount-name> and list its contents
dbutils.fs.mount(f"s3a://{access_key}:{encoded_secret_key}@{aws_bucket_name}", f"/mnt/{mount_name}")
display(dbutils.fs.ls(f"/mnt/{mount_name}"))
Source: https://docs.databricks.com/dbfs/mounts.html
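A note on the `replace("/", "%2F")` step above: the secret key is embedded in the `s3a://` URI, so any `/` in it must be percent-encoded or the URI is misparsed. A more general way to do this (a sketch, using the standard-library `urllib.parse.quote` rather than a bare `str.replace`; the helper name is my own):

```python
from urllib.parse import quote

def encode_secret_key(secret_key: str) -> str:
    # safe="" forces every reserved character ("/", "+", etc.)
    # to be percent-encoded, not just "/"
    return quote(secret_key, safe="")

print(encode_secret_key("abc/def"))  # -> abc%2Fdef
```

That said, embedding keys in the mount URI at all is the pattern Databricks now discourages in favor of Unity Catalog or instance profiles.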