12-13-2022 08:48 AM
If I start an RStudio Server via a cluster init script as described here on a Unity Catalog cluster, the sparklyr connection fails with an error about a missing Credential Scope. I tried it in both 11.3 LTS and 12.0 Beta, but only on a Personal Single Node cluster. I did not try it with a remote RStudio (via Databricks Connect).
This looks like a major incompatibility?
Something similar happens when I open 2 identical notebooks each with the following cell:
library(sparklyr)
library(tidyverse)
sc <- spark_connect(method="databricks")
sdf <- sc %>% sdf_sql("SELECT * FROM samples.nyctaxi.trips")
sdf
Running this in the first notebook gives the desired outcome.
If you then run the second notebook on the same cluster (without restarting the cluster), you get the following error:
Error : org.apache.spark.SparkException: Missing Credential Scope.
at com.databricks.unity.UCSDriver$Manager.$anonfun$scope$1(UCSDriver.scala:103)
at scala.Option.getOrElse(Option.scala:189)
at com.databricks.unity.UCSDriver$Manager.scope(UCSDriver.scala:103)
at com.databricks.unity.UCSDriver$Manager.currentScope(UCSDriver.scala:97)
at com.databricks.unity.UnityCredentialScope$.currentScope(UnityCredentialScope.scala:100)
at com.databricks.unity.UnityCredentialScope$.getSAMRegistry(UnityCredentialScope.scala:120)
at com.databricks.unity.SAMRegistry$.getSAMOpt(SAMRegistry.scala:346)
at com.databricks.unity.CredentialScopeSQLHelper$.registerPathIfMissing(CredentialScopeSQLHelper.scala:204)
at com.databricks.sql.transaction.tahoe.DeltaLog$.apply(DeltaLog.scala:853)
at com.databricks.sql.transaction.tahoe.DeltaLog$.apply(DeltaLog.scala:774)
at com.databricks.sql.transaction.tahoe.DeltaLog$.apply(DeltaLog.scala:754)
at com.databricks.sql.transaction.tahoe.DeltaLog$.forTable(DeltaLog.scala:701)
at com.databricks.sql.transaction.tahoe.DeltaLog$.$anonfun$forTableWithSnapshot$1(DeltaLog.scala:780)
at com.databricks.sql.transaction.tahoe.DeltaLog$.withFreshSnapshot(DeltaLog.scala:806)
at com.databricks.sql.transaction.tahoe.DeltaLog$.forTableWithSnapshot(DeltaLog.scala:780)
at com.databricks.sql.managedcatalog.SampleTable.readSchema(SampleTables.scala:109)
at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.$anonfun$getSampleTableMetadata$1(ManagedCatalogSessionCatalog.scala:955)
at scala.Option.map(Option.scala:230)
at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.getSampleTableMetadata(M[...]
Notably, the same code does not produce this error in SparkR.
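For comparison, a sketch of the SparkR equivalent of the query above (this is the variant that did not fail in our tests; on Databricks the SparkR session is already initialized, so the sparkR.session() call is effectively a no-op there):

library(SparkR)
sparkR.session()  # no-op on Databricks, included so the snippet is self-contained
sdf <- sql("SELECT * FROM samples.nyctaxi.trips")
head(sdf)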
There is already a Stack Overflow question about this: https://stackoverflow.com/questions/74575249/sparklyr-multiple-databricks-notebooks-second-connectio...
12-14-2022 01:18 PM
Hi @Kaniz Fatma! Maybe you can point us in the right direction?
02-15-2023 06:10 AM
Hi @Philipp Thomann, sure!
Let me get back to you asap.
02-24-2023 08:02 AM
Hi @Philipp Thomann, the error about a missing Credential Scope is related to AWS credentials: it is likely caused by the RStudio server not being able to access the AWS credentials that Spark needs to read from S3.
When you start an RStudio server on a Unity Catalog cluster, the AWS credentials are not automatically propagated to the RStudio server, so you must set them up in the RStudio server yourself.
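A minimal sketch of what that could look like, assuming the credentials are exposed to the RStudio R session as environment variables (the variable names follow the standard AWS convention; the values are placeholders you would substitute yourself):

# Hypothetical sketch: expose AWS credentials to the RStudio R session
# before connecting with sparklyr. All values below are placeholders.
Sys.setenv(
  AWS_ACCESS_KEY_ID = "<your-access-key-id>",
  AWS_SECRET_ACCESS_KEY = "<your-secret-access-key>",
  AWS_DEFAULT_REGION = "<your-region>"
)

library(sparklyr)
sc <- spark_connect(method = "databricks")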
02-28-2023 11:19 AM
Thanks @Kaniz Fatma for your answer!
Actually, we have this problem on Azure Databricks; we have not tested it on AWS yet. Does that make a difference?
Also, as written in the question, the problem occurs (and is actually more pressing) directly in the Databricks notebooks themselves, whenever a second R notebook connects to the same cluster. This happens both on interactive clusters and when several steps of a Workflow use R notebooks with Unity Catalog.
Could you give us a pointer (documentation, source code?) to which AWS/Azure credentials you mean and how we should set them up?
Best, Philipp
09-26-2023 04:42 PM
I have run into this issue as well. Let me know if there was any resolution.