Errors When Using R on Unity Catalog Clusters

Kayla
Contributor

We are running into errors when running workflows in which multiple tasks run the same notebook with different parameters. The notebooks read from tables we still have in hive_metastore; there are no Unity Catalog tables or functionality referenced anywhere. We're getting this error: "com.databricks.unity.error.MissingCredentialScopeException: [UNITY_CREDENTIAL_SCOPE_MISSING_SCOPE] Missing Credential Scope. Failed to find Unity Credential Scope"

This happens consistently for 2 of the 3 notebooks, so every workflow run has 1 success and 2 failures. The same code runs fine on a cluster that is not Unity Catalog enabled, but we will need Unity Catalog going forward. This is on Databricks Runtime 13.3 LTS.
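For context, each notebook boils down to something like this (a simplified sketch; the widget, table, and column names are placeholders, not our real ones):

```r
library(SparkR)

# Parameter passed in from the workflow task (widget name is a placeholder)
region <- dbutils.widgets.get("region")

# Plain hive_metastore read; no Unity Catalog objects referenced anywhere
df <- sql(sprintf(
  "SELECT * FROM hive_metastore.sales.orders WHERE region = '%s'",
  region
))

# Ordinary SparkR transformations downstream
daily <- agg(groupBy(df, "order_date"), total = sum(df$amount))
display(daily)
```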


5 REPLIES

Kaniz
Community Manager

Hi @Kayla,

  • This can happen if Unity Catalog is not configured properly on the workspace or cluster; the sketch after this list shows a quick in-notebook check of what the session actually sees. If it isn't, set it up following the Unity Catalog guidelines for your Databricks environment.
  • Check whether any environment variables related to your workflow (especially those related to Unity Catalog) are unset or set incorrectly.
  • If you're using Databricks Connect, make sure you're on a compatible version.
  • If your workflows read from S3 or other AWS services, ensure that your AWS credentials (access_id, access_key, and session_token) are correct and have not expired.
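One quick check you can run from inside a failing notebook is to print what the session actually resolves names against and whether it can see the hive_metastore tables at all (a rough sketch; swap in your own schema name):

```r
library(SparkR)

# Which catalog/schema is this session resolving unqualified table names against?
showDF(sql("SELECT current_catalog(), current_database()"))

# Can this cluster see the hive_metastore tables at all? ("default" is a placeholder schema)
showDF(sql("SHOW TABLES IN hive_metastore.default"))
```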

Good luck with resolving the credential scope issue! 🚀

Hi @Kaniz,
Thanks for the quick response.
To clarify: no Unity Catalog asset of any sort is referenced or involved. There are no environment variables in play. We're not using Databricks Connect; we're running a Databricks notebook via a workflow. We're not using any AWS services. We're on GCP, and we're not calling any GCP services directly either.

We have discovered that this works on non-Unity Catalog clusters, or if we run the notebooks sequentially instead of in parallel.
We're not clear why running these notebooks in parallel on a Unity Catalog cluster produces "credential scope" errors when nothing in the code should need a credential scope.
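For anyone who hits the same thing: serializing the runs is the stopgap. In the workflow that means making each task depend on the previous one instead of letting all three start at once, or, equivalently, folding the parameter sets into a single notebook and looping over them (a rough sketch, reusing the placeholder names from the first post):

```r
library(SparkR)

# Stopgap sketch: process the parameter sets one after another in a single
# task instead of three parallel tasks. Values and table name are placeholders.
regions <- c("us", "eu", "apac")

for (r in regions) {
  df <- sql(sprintf(
    "SELECT * FROM hive_metastore.sales.orders WHERE region = '%s'",
    r
  ))
  # ... same processing the parallel notebooks do ...
  cat("Finished", r, "\n")
}
```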

mariusatkinson
New Contributor

Do you have row-based access control on the data you are trying to access?

We do not, no.

mariusatkinson
New Contributor

Ah, I suspected it might have something to do with fine-grained access control and an incompatibility between R and UC when it's configured that way. Obviously, if you don't have it, it's not that.
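(For reference, "row-based access" here means Unity Catalog row filters. Purely as an illustration of the feature, with made-up function, table, and column names, attaching one looks roughly like this:)

```r
library(SparkR)

# Illustration only: define a boolean filter function, then attach it to a
# table as a row filter so each reader only sees rows the function allows.
sql("
  CREATE OR REPLACE FUNCTION main.default.region_filter(region STRING)
  RETURN IF(IS_ACCOUNT_GROUP_MEMBER('admins'), TRUE, region = 'US')
")

sql("
  ALTER TABLE main.default.orders
  SET ROW FILTER main.default.region_filter ON (region)
")
```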
