02-23-2025 11:48 PM
Does anybody know how I can connect to AWS S3 object storage from Databricks Community Edition? Is it possible to connect with a Community Edition account, or not?
Accepted Solutions
02-24-2025 10:47 AM
Hi @harsh_Dev ,
You can read from and write to AWS S3 with Databricks Community Edition. Since instance profiles are not available on Community Edition, you will need to configure the AWS credentials manually and access S3 via an S3 URI. Try the code below:

# Set the AWS credentials on the cluster's Hadoop configuration
spark._jsc.hadoopConfiguration().set("fs.s3a.access.key", "YOUR_ACCESS_KEY")
spark._jsc.hadoopConfiguration().set("fs.s3a.secret.key", "YOUR_SECRET_KEY")

# Read a CSV file from S3 using the s3a:// URI scheme
df = spark.read.csv("s3a://your-bucket-name/path/to/yourfile.csv")
df.show()
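The same credentials configuration also covers writes back to S3. As a minimal sketch (the output path below is a placeholder, not from the original thread):

# Write the DataFrame back to S3 as Parquet, reusing the s3a credentials set above
df.write.mode("overwrite").parquet("s3a://your-bucket-name/path/to/output/")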
02-24-2025 09:33 PM
Thanks for letting me know, @KaranamS!

