We are running into errors when running workflows with multiple jobs that use the same notebook with different parameters. The notebooks only read from tables that still live in hive_metastore; no Unity Catalog tables or functionality are referenced anywhere. We're getting this error: "com.databricks.unity.error.MissingCredentialScopeException: [UNITY_CREDENTIAL_SCOPE_MISSING_SCOPE] Missing Credential Scope. Failed to find Unity Credential Scope"
This happens consistently for 2 of the 3 notebooks, so every workflow run ends with 1 success and 2 failures. The same code runs fine on a cluster that is not Unity Catalog enabled, but we will need Unity Catalog going forward. This is on Databricks Runtime 13.3 LTS. A simplified sketch of what each notebook does is below.
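For reference, this is roughly the shape of each notebook; the table and parameter names here are placeholders, not our real ones (spark and dbutils are the objects Databricks predefines in a notebook):

# Each job task passes a different value for this job parameter.
dbutils.widgets.text("region", "us")
region = dbutils.widgets.get("region")

# All reads and writes go through hive_metastore explicitly;
# nothing references a Unity Catalog object.
df = spark.read.table("hive_metastore.sales.orders").where(f"region = '{region}'")
df.groupBy("order_date").count().write.mode("overwrite") \
    .saveAsTable(f"hive_metastore.sales.orders_daily_{region}")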