When running sparkR.session(), I hit the error below:
Spark package found in SPARK_HOME: /databricks/spark
Launching java with spark-submit command /databricks/spark/bin/spark-submit sparkr-shell /tmp/Rtmp5hnW8G/backend_porte9141208532d
Error: Could not find or load main class org.apache.spark.launcher.Main
/databricks/spark/bin/spark-class: line 101: CMD: bad array subscript
Error in sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap, :
JVM is not ready after 10 seconds
When I checked the cluster's log4j logs, I found that I had hit the RBackend limit:
21/06/29 18:26:17 INFO RDriverLocal: 394. RDriverLocal.e9dee079-46f8-4108-b1ed-25fa02742efb: Exceeded maximum number of RBackends limit: 200
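For context, each call to sparkR.session() registers an RBackend against the driver, so repeatedly creating sessions (e.g., in a loop or across many notebook runs) without stopping them can accumulate toward the cap shown in the log. A minimal sketch of the pattern that avoids this, assuming that stopping the session releases its RBackend slot (this snippet needs a SparkR/Databricks environment to run):

```r
library(SparkR)

# Starting a session creates an RBackend on the driver;
# each unstopped session counts toward the limit (200 in the log above).
sparkR.session()

# ... do work with the session ...

# Explicitly stop the session when done, so repeated runs
# don't leak backends (assumption: this frees the slot).
sparkR.session.stop()
```

Restarting the cluster should also reset the backend count, since the driver process (and its RBackend registry) is recreated.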