Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Why is mounts = dbutils.fs.mounts() not available now?

zmsoft
New Contributor III

Hi there,

I am currently using cluster version 15.4 LTS with Unity Catalog (UC) enabled. The Azure Data Lake Storage Gen2 account has hierarchical namespaces enabled.

I tried the following three ways to mount external storage, and all of them returned an error:

  1. Mount point via the ADLS Gen2 access key
  2. Mount point via the Microsoft Entra ID
  3. Create external location using Unity Catalog
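For reference, here is a minimal sketch of the legacy access-key mount pattern; all names, the mount point, and the key placeholder are illustrative, not taken from the thread. On a UC-enabled cluster, these `dbutils` calls are exactly what gets blocked:

```python
# Legacy ADLS Gen2 mount via account access key -- placeholder names only.
# On a Unity Catalog shared cluster, the dbutils calls below are blocked with
# py4j.security.Py4JSecurityException ("not whitelisted"), as in the error above.

def abfss_source(container: str, account: str) -> str:
    """Build the abfss:// source URI that dbutils.fs.mount expects."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/"

# dbutils.fs.mount(
#     source=abfss_source("mycontainer", "mystorageacct"),
#     mount_point="/mnt/mycontainer",
#     extra_configs={
#         "fs.azure.account.key.mystorageacct.dfs.core.windows.net": "<access-key>"
#     },
# )
# dbutils.fs.mounts()  # also blocked on UC shared clusters
```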

Error msg:

Py4JError: An error occurred while calling o444.mounts. Trace: py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore
[RequestId=5577be6a-0230-4213-9169-19d3fbb8630c ErrorClass=INVALID_PARAMETER_VALUE.HIERARCHICAL_NAMESPACE_NOT_ENABLED] The Azure storage account does not have hierarchical namespace enabled.
Single Read-only File
This External Location is a single file, not a directory. The associated Storage Credential grants permission to read its content.

 

Thanks & Regards,

zmsoft
1 ACCEPTED SOLUTION

Panda
Valued Contributor

@zmsoft 

Unity Catalog (UC) enforces strict access control policies, and traditional mounting techniques, such as using access keys or the dbutils.fs.mount command, are not recommended. See Databricks' best practices for DBFS and Unity Catalog.

Databricks advises against using DBFS mounts for external data sources when working with Unity Catalog. Instead, use Unity Catalog external locations/volumes and storage credentials to manage data access, which provides a more secure and governed approach.
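As a rough illustration of the volume approach (the catalog, schema, volume, and file names below are placeholders): once a UC volume exists, files are addressed through /Volumes paths with no mount at all:

```python
# UC volume paths replace /mnt mount points -- all names are placeholders.
volume_path = "/Volumes/main/default/my_volume/data/sales.csv"

# In a Databricks notebook with access to the volume:
# df = spark.read.option("header", "true").csv(volume_path)
# dbutils.fs.ls("/Volumes/main/default/my_volume/")  # allowed, unlike dbutils.fs.mounts()
```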

I would recommend following the approach that @filipniziol mentioned.


4 REPLIES

ShresthaBaburam
New Contributor III
Hi @zmsoft 
The error messages you're encountering point to two separate issues:
1. Hierarchical Namespace Not Enabled: This occurs when you attempt a Databricks File System (DBFS) operation such as mounting, but the Azure Data Lake Storage Gen2 (ADLS Gen2) account you're working with does not have hierarchical namespace (HNS) enabled. Hierarchical namespace is the ADLS Gen2 feature that enables directory and file operations.
2. Single Read-only File: The error also indicates that the location you are trying to access is not a directory but a single file, and your credentials grant only read-only access to it.
 
Possible Solutions
First, enable hierarchical namespace on ADLS Gen2: an ADLS Gen2 account must have HNS enabled to support file and directory operations like mounting. The simplest option is to check Enable hierarchical namespace when creating a new storage account; it is not a setting you can simply toggle on an existing account afterwards. If possible, create a new ADLS Gen2 account with hierarchical namespace enabled.
Second, access the file directly without mounting: since the external location is a single file and not a directory, you may not need to mount the storage location at all. Instead, access the file directly using its Azure file path.
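A hedged sketch of that direct-access pattern (storage account, container, and file names are placeholders):

```python
# Read the file directly by its abfss:// URI instead of mounting -- placeholder names.

def direct_path(container: str, account: str, relative: str) -> str:
    """Build a direct abfss:// path to a file in ADLS Gen2."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{relative}"

path = direct_path("mycontainer", "mystorageacct", "exports/report.csv")

# On a cluster whose UC external location covers this path:
# df = spark.read.format("csv").option("header", "true").load(path)
```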
ShresthaBaburam

zmsoft
New Contributor III

Hi @ShresthaBaburam ,

Thank you for your response. My storage account has hierarchical namespaces enabled, and I chose a directory, not a file.

 

filipniziol

Hi @zmsoft ,

Since you are working in the unity catalog environment, go with External Location or Volume.
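A minimal sketch of the DDL involved (the credential, location, volume names, and URL are placeholders; it assumes a UC-enabled workspace where the storage credential already exists):

```python
# Databricks SQL DDL for a UC external location and external volume -- placeholder names.
create_location = """
CREATE EXTERNAL LOCATION IF NOT EXISTS my_ext_loc
URL 'abfss://mycontainer@mystorageacct.dfs.core.windows.net/landing'
WITH (STORAGE CREDENTIAL my_storage_cred)
"""

create_volume = """
CREATE EXTERNAL VOLUME IF NOT EXISTS main.default.landing_vol
LOCATION 'abfss://mycontainer@mystorageacct.dfs.core.windows.net/landing'
"""

# In a Databricks notebook: spark.sql(create_location); spark.sql(create_volume)
```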

Please read this article:
https://community.databricks.com/t5/technical-blog/how-to-migrate-from-mount-points-to-unity-catalog...

As the first step, you need to configure a storage credential and assign it the proper permissions (the Storage Blob Data Contributor role granted to the User Assigned Managed Identity configured in your storage credential).
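For illustration, one way to grant that role with the Azure CLI; every identifier below is a placeholder, and the command is echoed as a dry run rather than executed:

```shell
# Grant Storage Blob Data Contributor to the managed identity behind the
# UC storage credential. All identifiers below are placeholders.
ASSIGNEE="00000000-0000-0000-0000-000000000000"  # managed identity principal ID
SCOPE="/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"

# Remove 'echo' to actually run the assignment with a logged-in Azure CLI.
echo az role assignment create \
  --assignee "$ASSIGNEE" \
  --role "Storage Blob Data Contributor" \
  --scope "$SCOPE"
```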

(screenshot attachment: filipniziol_0-1729061349832.png)

 

