03-23-2023 02:36 AM
I want to use an event trigger to start a job.
-The MI has the Storage Blob Data Contributor role
-Test connection is successful at the level of the external location
-I have read permission on the external location
-I have owner permission on the job
-On the storage account: the access connector is added as a resource instance to allow access.
But at the job level I get this error:
Invalid credentials for storage location abfss://***@***.dfs.core.windows.net/. The credentials for the external location in the Unity Catalog cannot be used to read the files from the configured path. Please grant the required permissions.
Any steps I missed in configuring this?
03-24-2023 11:45 PM
@Asphalt DR :
It seems like the credentials you provided for the external location in the Unity Catalog are not sufficient to read the files from the configured path. Here are some steps you can take to resolve the issue:
03-30-2023 07:13 AM
Thanks @Suteja Kanuri, sorry for the late reply.
Are there any other debugging options?
04-01-2023 08:57 PM
@Asphalt DR : Let me provide you with more details!
05-02-2023 07:31 PM
Good guidance
03-27-2023 10:11 PM
Hi @Asphalt DR
Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help.
We'd love to hear from you.
Thanks!
05-02-2023 06:43 PM
Same issue here, any solution?
05-02-2023 07:29 PM
@Stuart Fish please check the solution above
05-02-2023 07:46 PM
Thank you, I have already reviewed the above solution. As for the customer above, all the checks pass and the external location tests OK, except when you add exactly the same path to the job as the trigger.
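One quick local sanity check is whether the trigger path actually sits under the external location URL. Below is a minimal sketch in plain Python with hypothetical container/account/path names; it only compares the URIs and does not replace the Unity Catalog permission checks:

```python
from urllib.parse import urlparse

def parse_abfss(uri: str):
    """Split an abfss:// URI into (container, storage_account, path)."""
    p = urlparse(uri)
    if p.scheme != "abfss":
        raise ValueError(f"not an abfss URI: {uri}")
    container, _, host = p.netloc.partition("@")
    account = host.split(".")[0]
    return container, account, p.path.lstrip("/")

def is_under_location(trigger_uri: str, location_uri: str) -> bool:
    """True if the trigger path is within the external location's path."""
    tc, ta, tp = parse_abfss(trigger_uri)
    lc, la, lp = parse_abfss(location_uri)
    return (tc, ta) == (lc, la) and tp.startswith(lp)

# Hypothetical URIs for illustration.
print(is_under_location(
    "abfss://mycontainer@myaccount.dfs.core.windows.net/landing/incoming",
    "abfss://mycontainer@myaccount.dfs.core.windows.net/landing",
))  # → True
```

If this returns False for your job's trigger path, the mismatch is in the URIs themselves rather than in the granted permissions.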
09-12-2023 05:54 AM
It's a network error. If you check your storage account logs with:
StorageBlobLogs
| where UserAgentHeader contains 'azsdk-java-azure-storage-file-datal'
You will see
StatusCode 403
StatusText IpAuthorizationError
CallerIpAddress 10.120.*.* // or some private ip not in your allowed vnet list
It seems that the file discovery is not using the IPs allocated to the workspace; even with the Access Connector whitelisted, the result is the same.
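To make that diagnosis concrete: a small, self-contained Python sketch (hypothetical log records and CIDR ranges, not a real StorageBlobLogs client) that flags entries whose caller IP falls outside the ranges whitelisted on the storage account, which is exactly what produces the 403 IpAuthorizationError:

```python
import ipaddress

# CIDR ranges whitelisted on the storage account firewall
# (hypothetical values for illustration).
ALLOWED_VNETS = [ipaddress.ip_network(c) for c in ("10.0.0.0/16", "10.1.0.0/16")]

def is_ip_authorized(caller_ip: str) -> bool:
    """Return True if the caller IP falls inside an allowed VNet range."""
    ip = ipaddress.ip_address(caller_ip)
    return any(ip in net for net in ALLOWED_VNETS)

def flag_unauthorized(log_records):
    """Yield records that would be rejected with 403 IpAuthorizationError."""
    for rec in log_records:
        if rec["StatusCode"] == 403 and not is_ip_authorized(rec["CallerIpAddress"]):
            yield rec

# Hypothetical records shaped like the StorageBlobLogs output above.
records = [
    {"StatusCode": 403, "StatusText": "IpAuthorizationError", "CallerIpAddress": "10.120.4.7"},
    {"StatusCode": 200, "StatusText": "Success", "CallerIpAddress": "10.0.1.5"},
]
print([r["CallerIpAddress"] for r in flag_unauthorized(records)])  # → ['10.120.4.7']
```

In the real logs, the giveaway is the same pattern: a private CallerIpAddress that is not in any VNet you allowed on the storage firewall.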
09-12-2023 11:30 PM
for reference
https://stackoverflow.com/a/75906376/2842348
It seems this could be made to work by allowing connectivity from Databricks' private VNets, the same way it is currently done for serverless setups if you have an environment that blocks public access.