Hi there!
I believe I might have identified a bug in DBR 15.4 LTS Beta. The basic task of saving data to a Delta table, as well as the even more basic operation of saving a file to cloud storage, fails on 15.4 but works perfectly fine on 15.3. Some other details:
- it seems this only happens on a single-node, single-user cluster (it still has the Unity Catalog tag)
- the storage credential for this storage account is configured using a system-assigned managed identity, per the Databricks recommendation here
- the access connector identity has been granted the following roles on the storage account (again, following the article): Storage Account Contributor, Storage Blob Data Contributor, Storage Queue Data Contributor
- running the exact same command on the exact same cluster fails on 15.4 but succeeds on 15.3
- I'm running the notebook as my personal user, and I have ALL PRIVILEGES on the dev_sandbox catalog (quick check shown right after this list)
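
For reference, the catalog grants can be double-checked from the notebook with something along these lines (just a sanity check, not part of the repro):

# quick look at my grants on the catalog, run from the same notebook
display(spark.sql("SHOW GRANTS ON CATALOG dev_sandbox"))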
 
The command is a simple saveAsTable:
 
# three-part Unity Catalog table name (catalog.schema.table)
table_path = "dev_sandbox.files_metadata.bntestsave"
df.write.format("delta").mode("overwrite").saveAsTable(table_path)
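
The even more basic file write mentioned above is essentially a direct save to a Volume path, something along these lines (the real volume path is redacted in the error below, so this one is only a placeholder):

# placeholder Volume path for illustration; the real path is redacted in the error message below
df.write.mode("overwrite").save("/Volumes/dev_sandbox/files_metadata/test_volume/bntestsave")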
 
Here is the error message, with some identifiable info redacted: 
 
Failed to save file to /Volumes/[REDACTED]. Error: An error occurred while calling o1607.save.
: org.apache.spark.SparkException: Exception thrown in awaitResult: Failed with shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.SASTokenProviderException while processing file/directory :[[REDACTED]/_committed_4591969627371730351] in method:[Failed to acquire a SAS token for write on [REDACTED]/_committed_4591969627371730351 due to com.databricks.unity.error.MissingCredentialScopeException: [UNITY_CREDENTIAL_SCOPE_MISSING_SCOPE] Missing Credential Scope. Failed to find Unity Credential Scope.. SQLSTATE: XXKUC]