03-11-2022 12:20 PM
1 ACCEPTED SOLUTION
03-12-2022 02:51 AM
Yes, you just need to create a service account for Databricks and then assign the Storage Admin role on the bucket. After that, you can mount GCS the standard way:
bucket_name = "<bucket-name>"  # name of your GCS bucket
mount_name = "<mount-name>"    # mount point under /mnt
dbutils.fs.mount("gs://%s" % bucket_name, "/mnt/%s" % mount_name)
display(dbutils.fs.ls("/mnt/%s" % mount_name))  # verify the mount works
More info here: https://docs.gcp.databricks.com/data/data-sources/google/gcs.html
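For reference, here is a small sketch of the same mount call with the path construction factored into a helper. The bucket and mount names below are placeholders, and `dbutils`/`display` are only available inside a Databricks notebook, so that call is left commented out:

```python
def gcs_mount_paths(bucket_name: str, mount_name: str):
    """Build the (source URI, mount point) pair for a GCS bucket mount."""
    return f"gs://{bucket_name}", f"/mnt/{mount_name}"

# Placeholder names; replace with your own bucket and mount name.
source, mount_point = gcs_mount_paths("my-bucket", "my-mount")
# dbutils.fs.mount(source, mount_point)  # runs only inside a Databricks notebook
```

This just makes it easier to keep the `gs://` source and the `/mnt/` mount point consistent if you mount several buckets.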

