
Where does mount info get stored with an external Hive metastore?

Vgupta1
New Contributor

I have created a cluster with the advanced options set for an external Hive metastore. I created a mount using a service principal for my data lake and then created a table using the mount. As far as I know, the table's metadata will be stored in the SQL DB, but I would like to know where the mount info is stored. Will it be in the SQL DB or in Databricks?
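
For reference, the mount and table were created along these lines (a minimal sketch assuming ADLS Gen2 and a service principal; the storage account, container, secret scope, database, and table names are placeholders, not the real ones):

# OAuth configs for the service principal (all names are placeholders).
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="my-scope", key="sp-secret"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Mount the container under DBFS; this is run once per workspace.
dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/datalake",
    extra_configs=configs,
)

# Create a table over the mounted path; only this table metadata (schema, location)
# lands in the external Hive metastore (the SQL DB).
spark.sql("""
    CREATE TABLE IF NOT EXISTS my_db.my_table
    USING DELTA
    LOCATION '/mnt/datalake/my_table'
""")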

Further to this, I created another workspace for another team and created a cluster with the same advanced options to link to the external metastore. I have not created the mount in the new workspace. If I access the table there, will it work? If yes, that means the mount info was in the SQL DB. If not, do I need to create a mount with the same name in the new workspace? What if the previous mount points to prod and the new workspace's mount points to dev; is that a possible scenario?

Since the new workspace belongs to a separate team, we don't want them to inherit our mount info from the SQL DB.

3 REPLIES

karthik_p
Esteemed Contributor

@Vishal Gupta Mount information won't be stored anywhere, because it involves credentials; Databricks does not store it for security reasons. Please share the command that you used. Is it a mount (dbutils.fs.mount) command?

Prabakar
Databricks Employee

Hi @Vishal Gupta The mount points are not shared across workspaces; you can use the same mount name in all of the workspaces. The mount path is created under DBFS, and if you do an ls you will be able to see the mount paths. The credentials/config that you provide on the cluster will authenticate you to access the data via the mount path.
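
You can verify this from a notebook in each workspace (a small sketch; /mnt/datalake is a placeholder mount name):

# List all mount points visible in the current workspace; mounts are
# workspace-local, so a new workspace only shows the mounts created there.
for m in dbutils.fs.mounts():
    print(m.mountPoint, "->", m.source)

# An ls through a mount path only works if that mount exists in this workspace
# and the credentials behind it can reach the underlying storage.
display(dbutils.fs.ls("/mnt/datalake"))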

User16725394280
Contributor II

Hi @Vishal Gupta When we create a mount, we store the mount info in a file in the DBFS root bucket.
