10-15-2024 07:14 PM
Hi there,
I am currently using cluster version 15.4 LTS with UC enabled. My Azure Data Lake Storage Gen2 account has hierarchical namespace enabled.
I tried the following three ways to mount external storage, and all of them returned an error:
- Mount point via the ADLS Gen2 access key
- Mount point via the Microsoft Entra ID
- Create external location using Unity Catalog
Error messages:
Py4JError: An error occurred while calling o444.mounts. Trace: py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore
[RequestId=5577be6a-0230-4213-9169-19d3fbb8630c ErrorClass=INVALID_PARAMETER_VALUE.HIERARCHICAL_NAMESPACE_NOT_ENABLED] The Azure storage account does not have hierarchical namespace enabled.
Single Read-only File
This External Location is a single file, not a directory. The associated Storage Credential grants permission to read its content.
Thanks & Regards,
zmsoft
Accepted Solutions
10-17-2024 02:34 AM
Unity Catalog (UC) enforces strict access control policies, and traditional mounting techniques, such as using access keys or the dbutils.fs.mount command, are not recommended. See the Databricks best practices for DBFS and Unity Catalog.
Databricks advises against using DBFS mounts for external data sources when working with Unity Catalog. Instead, it is recommended to use Unity Catalog external locations/volumes and storage credentials to manage data access, which provides a more secure and governed approach.
I would recommend following the approach that @filipniziol mentioned.
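A minimal sketch of the recommended mount-free pattern, assuming an external location already covers the path (the catalog, schema, volume names, and the abfss:// path below are placeholders, not values from this thread):

```sql
-- Sketch only: names and the abfss:// path are placeholders.
-- An external location must already cover this path.
CREATE EXTERNAL VOLUME IF NOT EXISTS main.default.landing_vol
  LOCATION 'abfss://mycontainer@mystorageaccount.dfs.core.windows.net/landing';

-- Files become addressable under /Volumes/... with UC governance applied,
-- so no DBFS mount (and no dbutils.fs.mount call) is needed.
SELECT * FROM read_files('/Volumes/main/default/landing_vol/');
```

Reads and writes then go through the /Volumes path (or the abfss:// URL of the external location directly), and access is governed by Unity Catalog grants rather than by cluster-scoped keys.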
10-15-2024 08:35 PM
10-15-2024 10:19 PM
Hi @ShresthaBaburam ,
Thank you for your response. My storage account has hierarchical namespace enabled, and I have chosen a directory, not a file.
10-15-2024 11:57 PM - edited 10-15-2024 11:58 PM
Hi @zmsoft ,
Since you are working in a Unity Catalog environment, go with an external location or a volume.
Please read this article:
https://community.databricks.com/t5/technical-blog/how-to-migrate-from-mount-points-to-unity-catalog...
As the first step, you need to configure a storage credential and assign it the proper permissions (the Storage Blob Data Contributor role on the storage account for the User Assigned Managed Identity configured in your storage credential).
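A minimal sketch of the steps after the storage credential exists (the credential itself, wrapping the User Assigned Managed Identity, is created in Catalog Explorer or via the Databricks API; my_cred, my_ext_loc, the abfss:// path, and the grantee group below are placeholder assumptions):

```sql
-- Sketch only: all names and the path are placeholders.
-- Prerequisite: the managed identity behind my_cred holds the
-- Storage Blob Data Contributor role on the storage account.

-- Create the external location on top of the storage credential.
CREATE EXTERNAL LOCATION IF NOT EXISTS my_ext_loc
  URL 'abfss://mycontainer@mystorageaccount.dfs.core.windows.net/landing'
  WITH (STORAGE CREDENTIAL my_cred);

-- Grant file-level access on the location to the principals who need it.
GRANT READ FILES, WRITE FILES ON EXTERNAL LOCATION my_ext_loc TO `data_engineers`;
```

With the external location in place, volumes or external tables can be created under that path, and no mount point is involved.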

