
job event trigger - Invalid credentials for storage location

AsphaltDataRide
New Contributor III

I want to use an event trigger to start a job.

- The MI has the Storage Blob Data Contributor role

- Test connection is successful at the level of the external location (screenshot attached)

- I have read permission on the external location

- I have owner permission on the job

- On the storage account: the access connector is added as a resource instance to allow access.

But at the job level I get this error:

Invalid credentials for storage location abfss://***@***.dfs.core.windows.net/. The credentials for the external location in the Unity Catalog cannot be used to read the files from the configured path. Please grant the required permissions.

Any steps I missed in configuring this?
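
If it helps to narrow things down, the read that the trigger performs can be approximated from a notebook on a Unity Catalog-enabled cluster. A minimal sketch, with the container, account, and path as placeholders:

# Placeholder path: use the same values configured on the file arrival trigger
path = "abfss://<container>@<account>.dfs.core.windows.net/<path>"

# On a UC-enabled cluster this listing goes through the external location's
# storage credential, which is what the trigger also relies on
for f in dbutils.fs.ls(path):
    print(f.path, f.size)

If this listing succeeds while the trigger still fails, the cause may sit on the network/firewall side rather than with the Unity Catalog grants (see the discussion further down the thread).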


Anonymous
Not applicable

@Asphalt DR:

It seems like the credentials you provided for the external location in the Unity Catalog are not sufficient to read the files from the configured path. Here are some steps you can take to resolve the issue:

  1. Double-check that the MI has the correct permissions. Ensure that the MI has the necessary permissions to access the external location, including read access to the files in the configured path (a quick way to verify this from a notebook is sketched after this list).
  2. Verify that the external location is correctly configured in the Unity Catalog. Make sure that the credentials provided in the Unity Catalog are correct and up-to-date.
  3. Check if there are any firewall or network settings that could be preventing access to the external location. Ensure that the network settings are properly configured to allow access to the external location.
  4. Verify that the access connector is properly configured in the storage account. Ensure that the access connector has the necessary permissions to access the external location.
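
To make steps 1 and 2 concrete, here is a minimal sketch for inspecting (and, if needed, granting) the Unity Catalog privileges on the external location from a notebook; the location name and principal are placeholders:

# Placeholders: my_external_location and user@example.com
display(spark.sql("SHOW GRANTS ON EXTERNAL LOCATION my_external_location"))

# A metastore admin or the location owner can grant read access:
spark.sql("GRANT READ FILES ON EXTERNAL LOCATION my_external_location TO `user@example.com`")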

AsphaltDataRide
New Contributor III

Thanks @Suteja Kanuri, sorry for the late reply.

  1. Double-check that the MI has the correct permissions. Ensure that the MI has the necessary permissions to access the external location, including read access to the files in the configured path. It has the right permissions; otherwise the test connection would not work, right?
  2. Verify that the external location is correctly configured in the Unity Catalog. Make sure that the credentials provided in the Unity Catalog are correct and up-to-date. It has the right configuration; otherwise the test connection would not work, I assume?
  3. Check if there are any firewall or network settings that could be preventing access to the external location. Ensure that the network settings are properly configured to allow access to the external location. The firewall should not be a problem. On top of this I added what is shown in the attached screenshot.
  4. Verify that the access connector is properly configured in the storage account. Ensure that the access connector has the necessary permissions to access the external location.

Are there any other debugging options?
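
One more check worth doing before digging into the network side: confirm that the path configured on the file arrival trigger is exactly covered by the external location. A minimal sketch using the Jobs API (workspace host, token, and job ID are placeholders, and the response field names are an assumption that may vary by API version):

import requests

host = "https://<workspace>.azuredatabricks.net"   # placeholder
token = "<personal-access-token>"                  # placeholder
job_id = 123                                       # placeholder

resp = requests.get(
    f"{host}/api/2.1/jobs/get",
    headers={"Authorization": f"Bearer {token}"},
    params={"job_id": job_id},
)
resp.raise_for_status()

trigger = resp.json().get("settings", {}).get("trigger", {})
print(trigger.get("file_arrival", {}).get("url"))  # compare with the external location URL

If the trigger URL is not fully covered by the path the external location defines, this kind of credentials error can show up even though the test connection on the external location itself succeeds.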

Anonymous
Not applicable

@Asphalt DR: Let me provide some more options!

  1. Check the job logs: The job logs may contain additional information about the error. You can open the job in the Databricks workspace and inspect the failed run's output and event log. Look for any error messages that may provide additional context.
  2. Test the credentials using Azure Storage Explorer: You can use Azure Storage Explorer (or the scripted equivalent sketched after this list) to test the credentials you are using to access the external storage location. If the credentials work with Storage Explorer, then there may be an issue with the configuration of the event trigger or job.
  3. Check if the account key has been regenerated: If you have recently regenerated the account key for the storage account, ensure that the new key is being used in the job and event trigger configuration.
  4. Check if the storage account is in a different region than the job and event trigger: If the storage account is in a different region than the job and event trigger, there may be a delay in syncing the permissions. Wait for a few minutes and try again.
  5. Verify the event trigger configuration: Ensure that the event trigger is configured correctly, with the correct storage account and path specified. Check that the trigger is enabled and that it is set to the correct event type.
  6. Check if the file system is correctly configured: If the external storage location is an ADLS Gen2 file system, ensure that it is correctly configured. Check that the file system is not in a deleted state, and that the file system permissions are correctly set.
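
For point 2, the same test can also be scripted instead of using Storage Explorer. A minimal sketch with the Azure SDK for Python (account, container, and prefix are placeholders; requires the azure-identity and azure-storage-file-datalake packages):

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholders: <storage-account>, <container>, <prefix>
service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client(file_system="<container>")
for p in fs.get_paths(path="<prefix>"):
    print(p.name)

Note that this authenticates as whatever identity DefaultAzureCredential resolves (for example your Azure CLI login), not as the access connector's managed identity, so it mainly rules out container-level and firewall problems rather than the connector's role assignment.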

Priyag1
Honored Contributor II

Good guidance

Anonymous
Not applicable

Hi @Asphalt DR

Hope all is well! Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!

zesdatascience
New Contributor III

Same issue here, any solution?

Priyag1
Honored Contributor II

@Stuart Fish please check the solution above

zesdatascience
New Contributor III

Thank you, I have already reviewed the solution above. As with the original poster, all the checks work and the external location tests OK, except when you add exactly the same path to the job as the trigger.

adriennn
Contributor II

It's a network error. If you check your storage account logs with

StorageBlobLogs
| where UserAgentHeader contains 'azsdk-java-azure-storage-file-datal'

you will see

StatusCode 403
StatusText IpAuthorizationError
CallerIpAddress 10.120.*.* // or some private IP not in your allowed VNet list

It seems that the file discovery is not using the IPs allocated to the workspace, and even with the Access Connector whitelisted, the result is the same.

adriennn
Contributor II

For reference:

https://stackoverflow.com/a/75906376/2842348

It seems this could be made to work by allowing connectivity from Databricks' private VNets, the same way it is currently done for serverless setups if you have an environment that blocks public access.
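
If you want to see what the storage account firewall currently allows before opening anything up, here is a read-only sketch with azure-mgmt-storage (subscription, resource group, and account name are placeholders):

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

# Placeholders: <subscription-id>, <resource-group>, <storage-account>
client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")
account = client.storage_accounts.get_properties("<resource-group>", "<storage-account>")

rules = account.network_rule_set
print("default action:    ", rules.default_action)
print("ip rules:          ", [r.ip_address_or_range for r in rules.ip_rules])
print("vnet rules:        ", [r.virtual_network_resource_id for r in rules.virtual_network_rules])
print("resource instances:", [r.resource_id for r in (rules.resource_access_rules or [])])

The Access Connector should show up under the resource instance rules; the private 10.x address seen in the logs above is the kind of caller none of these rules can currently match.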
