Unable to connect/read files from ADLS Gen2 using account key
01-23-2023 08:59 AM
Running the code below gives this error:

[RequestId=5e57b66f-b69f-4e8b-8706-3fe5baeb77a0 ErrorClass=METASTORE_DOES_NOT_EXIST] No metastore assigned for the current workspace.
```python
spark.conf.set(
    "fs.azure.account.key.mystorageaccount.dfs.core.windows.net",
    dbutils.secrets.get(scope="keyvault-secret-scope2", key="ADLS-GEN2-SECRET-KEY"))
dbutils.fs.ls("abfss://containername@mystorageaccount.dfs.core.windows.net/")
```
Where:
- mystorageaccount: the ADLS Gen2 storage account name
- containername: the container name
- scope="keyvault-secret-scope2": a secret scope created in Azure Databricks with Manage permission for "All users"
- key="ADLS-GEN2-SECRET-KEY": a secret in Azure Key Vault that holds the ADLS Gen2 access key
01-23-2023 11:03 AM
Is it a Unity Catalog workspace?
01-23-2023 11:46 AM
Hi @Farooq ur rehman,
What's the cluster's security access mode? In the cluster's JSON, data_security_mode has to be set to LEGACY_SINGLE_USER. If it is set to SINGLE_USER, the cluster will look for a Unity Catalog metastore.
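For reference, this is roughly what the relevant part of the cluster's JSON looks like (the other field values below are just placeholders; data_security_mode is the one that matters here):

```json
{
  "cluster_name": "my-non-uc-cluster",
  "spark_version": "11.3.x-scala2.12",
  "node_type_id": "Standard_DS3_v2",
  "num_workers": 2,
  "data_security_mode": "LEGACY_SINGLE_USER"
}
```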
02-24-2023 03:36 PM
Hi @Farooq ur rehman,
Just a friendly follow-up. Did any of the responses help you resolve your question? If so, please mark that response as the best answer. Otherwise, please let us know if you still need help.

