How is this possible? Access error in storage after selecting data from a delta table in the storage
02-06-2023 05:30 AM
Hi people, I'm using Databricks Runtime 11.2 on Standard_F4 clusters with Unity Catalog enabled and the service principals working.
I simply select from two delta tables that live in storage X.
display() works, and writing a new delta table also works,
but when I union() these dataframes and run a simple display() on the result, I get the error "Failure to initialize configurationInvalid configuration value detected for fs.azure.account.key". How is this possible?
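For context, this is roughly what I'm running (a minimal sketch; the catalog, schema, and table names are placeholders, not my real ones):

# both delta tables live in the same storage account X
df_a = spark.read.table("main.my_schema.table_a")
df_b = spark.read.table("main.my_schema.table_b")

display(df_a)  # works
display(df_b)  # works

# writing a new delta table from either dataframe also works
df_a.write.format("delta").mode("overwrite").saveAsTable("main.my_schema.table_a_copy")

# but displaying the union fails with the fs.azure.account.key error below
df_union = df_a.union(df_b)
display(df_union)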
Traceback:
Py4JJavaError: An error occurred while calling o847.showString.
: Failure to initialize configurationInvalid configuration value detected for fs.azure.account.key
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.SimpleKeyProvider.getStorageAccountKey(SimpleKeyProvider.java:51)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AbfsConfiguration.getStorageAccountKey(AbfsConfiguration.java:593)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.initializeClient(AzureBlobFileSystemStore.java:1847)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.&lt;init&gt;(AzureBlobFileSystemStore.java:227)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.initialize(AzureBlobFileSystem.java:142)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3469)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:537)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:365)
at com.databricks.unity.SAM.createDelegate(SAM.scala:126)
at com.databricks.unity.SAM.createDelegate$(SAM.scala:119)
at com.databricks.unity.ClusterDefaultSAM$.createDelegate(SAM.scala:136)
at com.databricks.sql.acl.fs.CredentialScopeFileSystem.createDelegate(CredentialScopeFileSystem.scala:80)
at com.databricks.sql.acl.fs.CredentialScopeFileSystem.$anonfun$setDelegates$1(CredentialScopeFileSystem.scala:133)
at com.databricks.sql.acl.fs.Lazy.apply(DelegatingFileSystem.scala:286)
at com.databricks.sql.acl.fs.CredentialScopeFileSystem.getReadDelegate(CredentialScopeFileSystem.scala:108)
at com.databricks.backend.daemon.driver.DatabricksFileSystemHelper$.unwrapFileSystem(DatabricksFileSystemHelper.scala:52)
at com.databricks.tahoe.store.DelegatingLogStore.getDelegate(DelegatingLogStore.scala:179)
at com.databricks.tahoe.store.DelegatingLogStore.$anonfun$listFromUnsafe$1(DelegatingLogStore.scala:278)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
02-22-2023 02:14 PM
Hi,
The following article should help you resolve this error message: https://medium.com/@kyle.hale/troubleshooting-invalid-configuration-value-detected-for-fs-azure-acco...
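In short, the message means the cluster could not find a usable fs.azure.account.key setting for the storage account it tried to read. I haven't reproduced the article step by step, but a minimal sketch of the account-key approach looks like this, assuming the key is kept in a secret scope (the storage account, container, scope, and key names below are placeholders):

# make the ADLS Gen2 account key visible to this Spark session
spark.conf.set(
    "fs.azure.account.key.mystorageaccount.dfs.core.windows.net",
    dbutils.secrets.get(scope="my-scope", key="storage-account-key")
)

# reads through abfss:// paths on that account should then initialize cleanly
df = spark.read.format("delta").load(
    "abfss://my-container@mystorageaccount.dfs.core.windows.net/path/to/table"
)

On a Unity Catalog enabled workspace, the preferred route is usually to grant access through a storage credential and external location rather than setting account keys directly, so check which of the two your cluster is expected to use.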
04-08-2023 12:31 AM
Hi @Guilherme Cesar
Hope everything is going great.
Just wanted to check in to see whether you were able to resolve your issue. If so, would you mind marking the best answer so that other members can find the solution more quickly? If not, please let us know so we can help you.
Cheers!