
SparkR session failed to initialize

User16752239289
Valued Contributor

When I run sparkR.session(), I hit the error below:

Spark package found in SPARK_HOME: /databricks/spark
Launching java with spark-submit command /databricks/spark/bin/spark-submit sparkr-shell /tmp/Rtmp5hnW8G/backend_porte9141208532d
Error: Could not find or load main class org.apache.spark.launcher.Main
/databricks/spark/bin/spark-class: line 101: CMD: bad array subscript
Error in sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap,  :
  JVM is not ready after 10 seconds

When I checked the cluster's log4j logs, I found that I had hit the RBackend limit:

21/06/29 18:26:17 INFO RDriverLocal: 394. RDriverLocal.e9dee079-46f8-4108-b1ed-25fa02742efb: Exceeded maximum number of RBackends limit: 200

ACCEPTED SOLUTION

User16752239289
Valued Contributor

This happens when users run their R scripts from RStudio and the R sessions are not shut down gracefully, so each orphaned session keeps holding an RBackend on the driver.
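The most direct prevention is to stop the SparkR session explicitly when a script finishes, so its RBackend is released. A minimal sketch of that pattern, assuming a standard SparkR script (the tryCatch wrapper and the placeholder query are illustrative, not from the original post):

library(SparkR)

# Start (or attach to) the Spark session for this script.
sparkR.session()

tryCatch({
  # ... your SparkR workload; the query below is just a placeholder ...
  df <- sql("SELECT 1 AS x")
  head(df)
}, finally = {
  # Always release the driver-side RBackend, even if the workload errors.
  sparkR.session.stop()
})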

Databricks is working on handling R sessions better and on removing the limit.

As a workaround, you can create and run the init script below to raise the limit:

%scala
// Writes an init script to DBFS that raises the per-driver RBackend limit.
// Replace <value> with the desired limit and <path> with a DBFS directory
// where you keep init scripts. The string starts directly at the shebang so
// the generated script's first line is #!/bin/bash.
val initScriptContent = """|#!/bin/bash
                           |cat > /databricks/common/conf/rbackend_limit.conf << EOL
                           |{
                           |  databricks.daemon.driver.maxNumRBackendsPerDriver = <value>
                           |}
                           |EOL
                           |""".stripMargin
dbutils.fs.put("dbfs:/<path>/set_rbackend.sh", initScriptContent, true)
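Note that dbutils.fs.put only writes the script file; for it to take effect you still need to add dbfs:/<path>/set_rbackend.sh as an init script in the cluster configuration and restart the cluster.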


