Hey @Mailendiran
In Databricks, mounting external storage to DBFS (Databricks File System) with the `abfss` scheme is a common way to work with data stored in Azure Data Lake Storage Gen2. While you can always reference the full `abfss://` path directly, you can simplify access by mounting the container once and reading it through a mount point.
- Mount the storage container to DBFS with the `dbutils.fs.mount` command.
- Specify a mount point (a directory under `/mnt` in DBFS) and the `abfss` URI of the container.
To do this, create a notebook in your Databricks workspace and run the commands below after filling in your storage account name, container name, and secret scope details:
```python
storage_account_name = "your_storage_account_name"
container_name = "your_container_name"
mount_point = "/mnt/storage"

# The abfss URI takes the form abfss://<container>@<account>.dfs.core.windows.net/
dbutils.fs.mount(
    source=f"abfss://{container_name}@{storage_account_name}.dfs.core.windows.net/",
    mount_point=mount_point,
    extra_configs={
        "fs.azure.account.key." + storage_account_name + ".dfs.core.windows.net":
            dbutils.secrets.get(scope="your_scope_name", key="your_storage_key")
    }
)
```
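Note that `dbutils.fs.mount` will fail if the directory is already mounted. An optional check using the standard `dbutils.fs.mounts()` and `dbutils.fs.unmount()` utilities (a minimal sketch, reusing the `mount_point` variable from above) avoids that:

```python
# Re-mounting an existing mount point raises an error, so unmount it first if present.
if any(m.mountPoint == mount_point for m in dbutils.fs.mounts()):
    dbutils.fs.unmount(mount_point)

# ...then run the dbutils.fs.mount(...) call shown above.
```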
Once the mount is created, you can access the data through the mount point:
```python
data_df = spark.read.csv("/mnt/storage/path_to_your_data.csv")
```
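You can also treat the mount point like any other DBFS path, for example to list its contents or to read the CSV with extra options. This is just a sketch; the file path is the placeholder used above:

```python
# Browse the mounted container like a normal DBFS directory.
display(dbutils.fs.ls(mount_point))

# Read the CSV with a header row and inferred column types (adjust options as needed).
data_df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/storage/path_to_your_data.csv")
)
```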
Hope this helps, thanks for posting.
Leave a like if this helps! Kudos,
Palash