Data Governance

SparkException: There is no Credential Scope.

paniz_asghari
New Contributor

Hi 🙂

I am new to Databricks and am trying to connect to RStudio Server from my all-purpose compute cluster.

Here is the cluster configuration:

Policy: Personal Compute

Access mode: Single user

Databricks Runtime version: 13.2 ML (includes Apache Spark 3.4.0, Scala 2.12)
Following the instructions here, I am trying to run the code with both sparklyr and SparkR.

sparklyr

> library(sparklyr)
> sc <- spark_connect(method = "databricks")
However, I receive the following error:
Error in value[[3L]](cond) : 
  Failed to start sparklyr backend: java.util.concurrent.ExecutionException: org.apache.spark.SparkException: There is no Credential Scope. 
	at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:299)
	at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:286)
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
	at com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:135)
	at com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2344)
	at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2316)
	at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2278)
	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2193)
	at com.google.common.cache.LocalCache.get(LocalCache.java:3932)
	at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3936)
	at com.google.common.cache.Loc
In addition: Warning messages:
1: In file.create(to[okay]) :
  cannot create file '/usr/local/lib/R/site-library/sparklyr/java//sparklyr-2.2-2.11.jar', reason 'Permission denied'
2: In file.create(to[okay]) :
  cannot create file '/usr/local/lib/R/site-library/sparklyr/java//sparklyr-2.1-2.11.jar', reason 'Permission denied'
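The warnings suggest sparklyr is trying to copy its backend JARs into the read-only site library. As a workaround sketch (untested, assuming a standard per-user library path; this only addresses the permission warnings, not the Credential Scope error itself), sparklyr could be installed into a writable user library first:

# /usr/local/lib/R/site-library is not writable, so install sparklyr
# into a per-user library and make R search it first.
user_lib <- Sys.getenv("R_LIBS_USER")   # per-user library path
dir.create(user_lib, recursive = TRUE, showWarnings = FALSE)
install.packages("sparklyr", lib = user_lib)

.libPaths(c(user_lib, .libPaths()))     # put the user library first
library(sparklyr)
sc <- spark_connect(method = "databricks")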

SparkR

> library(SparkR)
> sparkR.session()
Java ref type org.apache.spark.sql.SparkSession id 1
> df <- SparkR::sql("SELECT * FROM default.diamonds LIMIT 2")

Error traceback

Error in handleErrors(returnStatus, conn) : 
  org.apache.spark.sql.AnalysisException: There is no Credential Scope. ; line 1 pos 14
	at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:69)
	at org.apache.spark.sql.execution.datasources.ResolveSQLOnFile$$anonfun$apply$1.applyOrElse(rules.scala:172)
	at org.apache.spark.sql.execution.datasources.ResolveSQLOnFile$$anonfun$apply$1.applyOrElse(rules.scala:94)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$2(AnalysisHelper.scala:219)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:106)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$1(AnalysisHelper.scala:219)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:372)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDownWithPruning(AnalysisHelper.scal
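In case it helps narrow things down, the same read can go through SparkR's catalog API instead of a raw SQL string; a minimal sketch (untested, assuming the default.diamonds table exists):

library(SparkR)
sparkR.session()

# tableToDF() resolves the table through the catalog rather than the
# SQL parser, so if this fails too, the problem looks like credential
# resolution in general rather than the SQL-on-file path in the traceback.
df <- tableToDF("default.diamonds")
head(df, 2)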

 Can someone help me?

1 REPLY

kunalmishra9
New Contributor III

Running into this issue as well. Let me know if you found a resolution, @paniz_asghari

 