Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Unity Credential Scope id not found in thread locals

kasiviss42
New Contributor

I am facing the following issue: [UNITY_CREDENTIAL_SCOPE_MISSING_SCOPE] Missing Credential Scope. Unity Credential Scope id not found in thread locals.

The issue occurs:

- when we try to list files using dbutils.fs.ls
- at times when we try to write or read a Parquet file

This happens roughly 4 out of 10 times a day, and it occurs in prod as well.

 

 

No proper fix has been documented.

 

Can someone help here? This has been blocking us for many days.

5 REPLIES

Alberto_Umana
Databricks Employee

Hi @kasiviss42,

Are you using any Scala code in your notebook?

The error [UNITY_CREDENTIAL_SCOPE_MISSING_SCOPE] Missing Credential Scope. Unity Credential Scope id not found in thread locals that you are encountering with dbutils.fs.ls and while reading or writing Parquet files is a known issue related to Unity Catalog in Databricks. It occurs because Unity credential scopes are stored as thread locals; in certain scenarios, such as when using Scala thread pools, these scopes are not transferred to the new threads, leading to the error.
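The thread-local mechanism described above can be illustrated with a small, self-contained Python sketch (plain stdlib code, not Databricks internals): state stored in a threading.local set on the main thread is simply not visible from a worker thread, which is the same class of problem as a credential scope id stored in thread locals.

```python
import threading
from concurrent.futures import ThreadPoolExecutor

# Illustrative only: mimics how thread-local state (like a credential
# scope id) set on the main thread does not propagate to pool workers.
ctx = threading.local()
ctx.scope_id = "credential-scope-123"  # hypothetical scope id

def read_scope():
    # Each thread gets its own namespace; workers have no scope_id set.
    return getattr(ctx, "scope_id", None)

print(read_scope())  # → credential-scope-123 (main thread sees the value)
with ThreadPoolExecutor(max_workers=1) as pool:
    print(pool.submit(read_scope).result())  # → None (worker thread does not)
```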

It's Python.

VZLA
Databricks Employee

Hi @kasiviss42, Thank you for sharing the error details!

To provide better guidance, could you clarify the following details about your environment and use case?

  1. Cluster Configuration:

    • Are you using a single-user cluster or a shared/multi-user cluster? If single-user, have you set spark.databricks.unityCatalog.singleUserAutoGlobalScope to true?
  2. Workload Details:

    • What type of operation or workload are you running when this error occurs (e.g., a Spark job, Delta Lake operation, or background process)?
    • Are you using thread pools, asynchronous tasks, or submitting jobs programmatically?
  3. Specific Context:

    • Does this issue occur consistently or intermittently?
    • Have you noticed if certain operations or configurations (e.g., checkpointing, writing to S3, or specific thread usage) trigger the error?
  4. Setup and Logs:

    • Are you running this on a free trial or a production workspace?
    • Can you share the specific stack trace or log details (excluding sensitive information) to help identify where the issue arises?
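For reference on question 1: on a single-user (dedicated) cluster, the flag can be set as a cluster-level Spark configuration. A sketch, assuming it applies to your Databricks Runtime version (verify against the documentation; it does not apply to shared clusters):

```
spark.databricks.unityCatalog.singleUserAutoGlobalScope true
```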

1. Cluster Configuration

Policy: Shared Compute
Access Mode: Shared
Hence we haven't set singleUserAutoGlobalScope to true.

2. Workload Details

We are executing a notebook written in PySpark.
i. We simply write a dataframe to Parquet and read it back (as a kind of checkpointing).
ii. We also use dbutils.fs.ls on volumes.
In both cases the error is thrown intermittently.
We are not using asynchronous tasks or thread pools.

3. Specific Context

The error occurs frequently: about 40% of runs fail and 60% succeed. We need the code to succeed 100% of the time.
We have noticed failures both when dbutils.fs.ls is executed and when we write the dataframe to Parquet or read it back.

4. Setup and Logs

We are running on a prod workspace.
i. The error below occurs when a Parquet file is written or read. The underlying error is caused by (com.databricks.backend.daemon.data.common.InvalidMountException):
Caused by: com.databricks.unity.error.MissingCredentialScopeException: [UNITY_CREDENTIAL_SCOPE_MISSING_SCOPE] Missing Credential Scope. Unity Credential Scope id not found in thread locals.. SQLSTATE: XXKUC

ii. The error below occurs when dbutils.fs.ls is executed:
Exception: An error occurred while calling o542.ls.
: com.databricks.backend.daemon.data.common.InvalidMountException:

The strange thing is that both failures come from the same notebook: one run fails at the Parquet read but succeeds at dbutils.fs.ls, another run fails at dbutils.fs.ls itself.

Please suggest a fix.
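Until the root cause is identified, intermittent failures like this are sometimes worked around with a retry wrapper around the failing calls. A minimal sketch, not an official Databricks fix; the helper name, backoff values, and error-substring checks are illustrative assumptions:

```python
import time

def with_retries(fn, attempts=4, base_delay=2.0):
    """Call fn(), retrying with exponential backoff on errors that look
    like the transient Unity credential scope / mount failures above."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as exc:
            transient = ("UNITY_CREDENTIAL_SCOPE_MISSING_SCOPE" in str(exc)
                         or "InvalidMountException" in str(exc))
            if not transient or attempt == attempts - 1:
                raise  # non-transient, or out of attempts: re-raise
            time.sleep(base_delay * (2 ** attempt))  # 2s, 4s, 8s, ...

# Usage inside a Databricks notebook (dbutils/spark exist only there;
# the path and variable names below are placeholders):
# files = with_retries(lambda: dbutils.fs.ls("/Volumes/<catalog>/<schema>/<volume>"))
# df = with_retries(lambda: spark.read.parquet(checkpoint_path))
```

This does not fix the underlying credential-scope problem, but for an error that succeeds on roughly 60% of attempts, a few retries can make the job reliable while the support ticket is investigated.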

VZLA
Databricks Employee

@kasiviss42 thanks for sharing these details. 

We can potentially get to the root cause by enabling trace logging for Unity Catalog, but communication will not be efficient through this channel. Could you please raise a Support Ticket? We'd like to help you address this problem, or at the very least find a workaround.
