Metastore creation - Azure Databricks - Internal Server Error

YakupOeztuerk
New Contributor III

Recently it has not been possible to create a metastore; the operation fails with "Internal Server Error", even though the managed identity has been granted the Storage Blob Data Contributor role. The error has apparently started showing up for a few other people recently as well, so I'm creating a new ticket.

1 ACCEPTED SOLUTION

Accepted Solutions

arpit
Contributor III

We can confirm that this was a regression on the Databricks side, and a fix has been rolled out for it. You can try testing again.

View solution in original post

10 REPLIES

Kaniz
Community Manager

Hi @YakupOeztuerk , have you created a support ticket with us?

 

Kaniz
Community Manager

Hi @YakupOeztuerk, this issue could be due to a variety of factors.

Here are some steps you can take to troubleshoot and resolve the problem:

  1. Check Databricks Status:

    • Before troubleshooting the issue, it's a good idea to check the Databricks status page or community discussions to see if any ongoing service disruptions could be causing the error.
  2. Error Message Details:

    • Try to gather more information about the specific error message. Sometimes, the "Internal Server Error" can be accompanied by additional details that provide insight into the root cause.
  3. Azure Managed Identity Configuration:

    • Double-check the configuration of your Azure Managed Identity. Ensure it's correctly set up and has the necessary permissions to access the storage account and create the metastore.
  4. Storage Account Permissions:

    • Ensure that the managed identity associated with Databricks has been granted the appropriate permissions on the storage account. It should have the "Storage Blob Data Contributor" and "Storage Blob Data Owner" roles to perform the necessary operations (see the sketch at the end of this reply).
  5. Network Connectivity:

    • Ensure there are no network connectivity issues between Databricks and the storage account. Firewall rules or network restrictions could potentially cause communication problems.
  6. Retry and Monitor:

    • Sometimes, transient issues can cause temporary failures. Retry the operation after some time and monitor whether the problem persists.
  7. Databricks Support:

    • If the issue continues to persist, consider filing a Databricks support ticket. They can provide specific assistance and guidance tailored to your setup.
  8. Check Azure Updates:

    • Sometimes, updates or changes in the Azure environment can impact functionality. Check for any recent updates or changes relevant to your setup.
  9. Review Documentation and Release Notes:

    • Review the official Databricks documentation and release notes for any recent changes or updates related to creating metastores or Azure Managed Identity integration. This might shed light on any new requirements or changes.
  10. Testing in Different Environments:

    • If possible, replicate the issue in a different environment or account. This can help determine whether the problem is specific to your current setup.

Remember that troubleshooting complex issues like this may require a systematic approach. Document the steps you've taken and any error messages you encounter, as this information will be valuable if you decide to seek assistance from Databricks support or the community.
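
For steps 3-5 above, here is a minimal sketch of how the role assignment and storage firewall could be checked (and the role granted if missing). All subscription, resource group, storage account, and principal IDs below are placeholders, and the script assumes the Azure CLI is installed and logged in; it is illustrative, not an official Databricks procedure.

import json
import subprocess

# All values below are placeholders -- substitute your own subscription,
# resource group, storage account, and the object ID of the access
# connector's managed identity.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-unity-catalog"
STORAGE_ACCOUNT = "ucmetastorestorage"
PRINCIPAL_ID = "11111111-1111-1111-1111-111111111111"
ROLE = "Storage Blob Data Contributor"

SCOPE = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}"
    f"/providers/Microsoft.Storage/storageAccounts/{STORAGE_ACCOUNT}"
)

def az(*args: str):
    """Run an Azure CLI command and return its parsed JSON output."""
    out = subprocess.run(
        ["az", *args, "--output", "json"],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out) if out.strip() else None

# Steps 3 and 4: check whether the role is already assigned on the storage account.
assignments = az(
    "role", "assignment", "list",
    "--assignee", PRINCIPAL_ID, "--scope", SCOPE, "--role", ROLE,
)
if not assignments:
    # Grant the role if it is missing (the caller needs rights to assign roles,
    # e.g. Owner or User Access Administrator on the storage account).
    az("role", "assignment", "create",
       "--assignee", PRINCIPAL_ID, "--role", ROLE, "--scope", SCOPE)
    print(f"Granted '{ROLE}' on {STORAGE_ACCOUNT}")
else:
    print(f"'{ROLE}' is already assigned on {STORAGE_ACCOUNT}")

# Step 5: a storage firewall with defaultAction "Deny" and no matching rules
# can also break metastore creation.
account = az("storage", "account", "show",
             "--name", STORAGE_ACCOUNT, "--resource-group", RESOURCE_GROUP)
print("Network default action:", account["networkRuleSet"]["defaultAction"])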

m997al
Contributor

Hi @Kaniz ... we are having the same problem. In your instructions, you say to give the Databricks access connector both the "Storage Blob Data Contributor" AND the "Storage Blob Data Owner" roles on the storage account. We definitely did not see the "Storage Blob Data Owner" role specified in the official instructions. Will try it and see if that was the issue.

Hi @Kaniz... no luck. Still the same "Access Validation Failed" error with a sub-error of "Internal server error" when we try to create the Unity Catalog metastore from the Databricks account console. As best we can tell, everything else works. Databricks is currently working for us; this is just a prototype Unity Catalog that we are setting up in order to create some new workspaces. But no luck on the "create metastore" step.

arpit
Contributor III

We have identified this issue and are reviewing it.

youssefmrini
Honored Contributor III

Based on the error message you provided, it seems like there might be an issue with the Databricks service itself. Since this is an internal server error, it's likely something that only the Databricks support team can resolve.

I recommend contacting the Databricks support team directly. You can submit a ticket through the Databricks support portal or email them at support@databricks.com. Be sure to include as much detail as possible about the issue, including any relevant error messages or logs.

In the meantime, you can try re-creating the metastore at a later time to see if the issue has been resolved.

arpit
Contributor III

We can confirm that this was a regression on the Databricks side, and a fix has been rolled out for it. You can try testing again.

Hi @arpit - that is great news.  

As an aside... in the instructions, a Global AAD Admin of course needs to be the first to access the Databricks account console and create the metastore... but does it matter if someone else sets up the Azure Storage account for the metastore (with all the right permissions and ownership), or does the storage account also have to be created by the Global Admin?

I'm guessing the answer is "no", and that someone with enough permissions over the storage account and the Databricks access connector can do all of that... but we weren't quite sure exactly what Databricks attempts at the "Create metastore" step (in case it is something beyond writing data to the storage container)...

Thanks!

Just following up - confirmed that the fix from Databricks solved our problem.  Also, confirmed that the Global Admin does NOT have to be the person to create the Azure Storage Account for the metastore, or the Databricks Access Connector.  The Global Admin does need to be the first to create the metastore within the Databricks account console, but then can assign Databricks account console access to others.

So it all worked.  Thank you!
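
For reference, a minimal sketch of how someone other than the Global Admin, with read access to the Azure resources, can look up the access connector's managed identity and confirm its role assignments on the metastore storage account. The resource IDs are placeholders, and this assumes the Azure CLI is installed and logged in; it is a sketch, not an official procedure.

import json
import subprocess

# Placeholder resource IDs -- replace with your own access connector and
# storage account. Assumes at least Reader access on both resources.
CONNECTOR_ID = (
    "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/"
    "rg-unity-catalog/providers/Microsoft.Databricks/accessConnectors/uc-connector"
)
STORAGE_ID = (
    "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/"
    "rg-unity-catalog/providers/Microsoft.Storage/storageAccounts/ucmetastorestorage"
)

def az(*args: str):
    """Run an Azure CLI command and return its parsed JSON output."""
    out = subprocess.run(["az", *args, "--output", "json"],
                         capture_output=True, text=True, check=True).stdout
    return json.loads(out)

# `az resource show` works against any resource type, so it also returns the
# access connector's system-assigned managed identity.
connector = az("resource", "show", "--ids", CONNECTOR_ID)
principal_id = connector["identity"]["principalId"]
print("Access connector managed identity:", principal_id)

# List the roles that identity holds on the metastore storage account.
for assignment in az("role", "assignment", "list",
                     "--assignee", principal_id, "--scope", STORAGE_ID):
    print(assignment["roleDefinitionName"])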

comcrazy
New Contributor II

Hi all, I got a similar error to the one described in this discussion: an internal server error (500) on https://accounts.azuredatabricks.net/api/2.0/accounts/4bfcxxxx-bfea-483f-b59b-c2b2e2xxxx4/metastores...

with the following response:

{
  "error_code": "INTERNAL_ERROR",
  "message": "",
  "details": [
    {
      "@type": "type.googleapis.com/google.rpc.RequestInfo",
      "request_id": "cc16c5cc-13dc-49a0-8473-8685bfa73a34",
      "serving_data": ""
    }
  ]
}

Can you advise how to solve this?

 

Thank you.
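
For anyone hitting the same 500 response: a small sketch of pulling the request_id out of an error payload shaped like the one above, so it can be quoted in a Databricks support ticket. The payload below is just the example from this thread pasted into a string.

import json

# The error body returned by the accounts API, saved as a string (for example
# copied from the browser's network tab or from your HTTP client).
error_body = """
{
  "error_code": "INTERNAL_ERROR",
  "message": "",
  "details": [
    {
      "@type": "type.googleapis.com/google.rpc.RequestInfo",
      "request_id": "cc16c5cc-13dc-49a0-8473-8685bfa73a34",
      "serving_data": ""
    }
  ]
}
"""

error = json.loads(error_body)

# The RequestInfo detail carries the request_id that Databricks support can use
# to locate the failing call on their side.
request_ids = [
    detail["request_id"]
    for detail in error.get("details", [])
    if detail.get("@type", "").endswith("google.rpc.RequestInfo")
]
print("error_code:", error["error_code"])
print("request_id:", ", ".join(request_ids))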

 
