Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

IllegalArgumentException: Mount failed due to invalid mount source

Mohamednazeer
New Contributor III

We are trying to create mounts for containers from two different storage accounts.

We are using Azure Storage Accounts and Azure Databricks.

We were able to create the mount for containers from one storage account, but when we try to create the mount for containers from the other storage account we get the exception "IllegalArgumentException: Mount failed due to invalid mount source".

The only difference between the two storage accounts is that the one we can mount is in the same resource group as the Databricks workspace, while the other storage account is in a different resource group.

Could someone help me with this?

dbutils.fs.mount(
    source="wasbs://temp@samfdevpoc.blob.core.windows.net",
    mount_point="/mnt/temp",
    extra_configs={"fs.azure.account.key.samfdevpoc.blob.core.windows.net": "<use key from azure adls>"}
)
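To avoid repeating the hard-coded source and config strings for each storage account, the mount arguments can be built with a small helper. This is a minimal sketch, not part of the original post: the helper name, the secret scope "my-scope", and the key name "storage-key" are illustrative assumptions.

```python
def wasbs_mount_args(container: str, account: str, account_key: str) -> dict:
    """Build the keyword arguments for dbutils.fs.mount for a wasbs container.

    Hypothetical helper for illustration; the names here are assumptions,
    not a Databricks API.
    """
    return {
        "source": f"wasbs://{container}@{account}.blob.core.windows.net",
        "mount_point": f"/mnt/{container}",
        "extra_configs": {
            f"fs.azure.account.key.{account}.blob.core.windows.net": account_key,
        },
    }

# On a Databricks cluster you would then call (not runnable outside a workspace),
# ideally fetching the account key from a secret scope instead of pasting it:
# dbutils.fs.mount(**wasbs_mount_args(
#     "temp", "samfdevpoc",
#     dbutils.secrets.get(scope="my-scope", key="storage-key")))
```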


1 ACCEPTED SOLUTION


Mohamednazeer
New Contributor III

Hi community,

The issue was because of cross-VNet access. The storage account and the Databricks workspace are in different VNets, so we had to create a private endpoint to reach resources across VNets. Once we created the private endpoint, the issue was resolved.

Thanks

 


3 REPLIES

Kaniz_Fatma
Community Manager

Hi @Mohamednazeer, it is entirely feasible to mount multiple Azure Storage Accounts for various clusters within the same workspace. A mount is permanent and is created through DBFS, so you only need to run it once. In Azure you can also create two Databricks workspaces, and within each workspace an environment variable can designate whether a cluster is for development or production. The key vault can be shared between both workspaces. Finally, you can upload a config file (e.g. .json or .conf) to your workspace and use it to select a different container depending on the environment you're working in.
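The config-file approach above can be sketched as follows. This is a minimal illustration, not something prescribed by Databricks: the config file layout, the `DEPLOY_ENV` environment variable, and the default "dev" value are all assumptions.

```python
import json
import os


def load_mount_source(config_path: str) -> str:
    """Read an environment-keyed JSON config and build the wasbs:// source.

    The file layout {"dev": {"container": ..., "storage_account": ...}, ...}
    and the DEPLOY_ENV variable are illustrative assumptions.
    """
    env = os.environ.get("DEPLOY_ENV", "dev")
    with open(config_path) as f:
        config = json.load(f)
    entry = config[env]
    return (
        f"wasbs://{entry['container']}@"
        f"{entry['storage_account']}.blob.core.windows.net"
    )
```

The resulting string can then be passed as the `source` argument of `dbutils.fs.mount`, so the same notebook works across dev and prod without code changes.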

 

I hope this helps! Let me know if you have any other questions.


 

