Error creating external location in Unity Catalog

Gustavo_Az
Contributor

Hello

When I try to create an external location I get this error:

Failed to access cloud storage: [AbfsRestOperationException] HTTP Error -1CustomTokenProvider getAccessToken threw com.databricks.api.base.DatabricksServiceException : INTERNAL_ERROR: Unhandled error in API call exceptionTraceId=ef19994f-ba84-422b-8a08-3086b92c05a0

I can read from and write to that container without any problem, whether from a Databricks notebook, Azure Storage Explorer, or the Azure portal. I used the root path of the container for the external location:

"abfss://silver@<storage_account>.dfs.core.windows.net/"

I am using Databricks Runtime 11.3 LTS. I created an Access Connector for Azure Databricks with the "Storage Blob Data Contributor" role on the storage account, and my account is an admin in both the Azure subscription and the Databricks account.
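For reference, an external location like the one described above can also be defined in SQL. The helper below composes the `CREATE EXTERNAL LOCATION` statement for a container root; the location and credential names are placeholders I made up for illustration, not values from this thread, and the exact statement should be checked against the Databricks SQL reference for your workspace version.

```python
def build_external_location_sql(name: str, container: str,
                                storage_account: str, credential: str) -> str:
    """Compose a CREATE EXTERNAL LOCATION statement for a container's root path.

    `credential` is the Unity Catalog storage credential that wraps the
    Access Connector's managed identity.
    """
    url = f"abfss://{container}@{storage_account}.dfs.core.windows.net/"
    return (
        f"CREATE EXTERNAL LOCATION IF NOT EXISTS {name}\n"
        f"URL '{url}'\n"
        f"WITH (STORAGE CREDENTIAL {credential});"
    )

# Placeholder names throughout; run the resulting SQL in a Databricks notebook.
print(build_external_location_sql(
    "silver_loc", "silver", "mystorageaccount", "my_credential"))
```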

This error doesn't tell me much beyond that it was an internal error, but I would like to know how I could solve it, or what could be causing it.

Thank you.

1 ACCEPTED SOLUTION

Anonymous
Not applicable

@Gustavo Amadoz Navarro​ :

The error message indicates an unhandled error in the API call; the trace ID can be used to track down the root cause. The error could be related to authentication or authorization for the cloud storage, or to a misconfiguration of the storage account or the external location.

Here are some steps you can take to troubleshoot the issue:

  1. Check the trace ID: The trace ID mentioned in the error message can be used to find the exact error in the Databricks logs. You can search for this trace ID in the Databricks logs to find more information about the error.
  2. Check the access token: Make sure that the access token used to authenticate to the cloud storage is valid and has the required permissions to access the storage account.
  3. Check the external location configuration: Verify that the configuration of the external location is correct, including the storage account name, container name, and root path.
  4. Check the firewall settings: Ensure that the firewall settings for the storage account allow access from the Azure Databricks workspace.
  5. Check the network connectivity: Check the network connectivity between the Azure Databricks workspace and the cloud storage. Try accessing the storage account from a different network to see if the issue persists.
  6. Upgrade to the latest Databricks runtime version: The issue may be related to a bug in the Databricks runtime. Try upgrading to the latest version of the runtime to see if the issue is resolved.

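Step 3 above (verifying the external location URL) can be partially automated. The sketch below checks that a URL follows the documented `abfss://<container>@<account>.dfs.core.windows.net/<path>` shape; the function name is my own, and this only catches formatting mistakes, not permission or network problems.

```python
from urllib.parse import urlparse

def check_abfss_url(url: str) -> list[str]:
    """Return a list of problems found in an abfss:// external-location URL."""
    problems = []
    parsed = urlparse(url)
    if parsed.scheme != "abfss":
        problems.append(f"scheme is '{parsed.scheme}', expected 'abfss'")
    # The netloc should look like <container>@<account>.dfs.core.windows.net
    if "@" not in parsed.netloc:
        problems.append("missing '<container>@' before the storage account host")
    else:
        container, host = parsed.netloc.split("@", 1)
        if not container:
            problems.append("container name is empty")
        if not host.endswith(".dfs.core.windows.net"):
            problems.append(f"host '{host}' is not a .dfs.core.windows.net endpoint")
    return problems

# The URL shape from the post (with a placeholder account) passes the checks:
print(check_abfss_url("abfss://silver@myaccount.dfs.core.windows.net/"))  # []
```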
If none of the above steps resolve the issue, you can try contacting the Azure Databricks support team for further assistance.


3 REPLIES


Anonymous
Not applicable

Hi @Gustavo Amadoz Navarro​ 

Hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!

Gustavo_Az
Contributor

I think I must have had something misconfigured. The way I solved the problem was to re-create the workspace and start from scratch; it was a small one for testing purposes.
