Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

databricks-connect version 13: spark-class2.cmd not found

Lazloo
New Contributor III

I installed the newest version "databricks-connect==13.0.0" and now get this issue:

```
The command "C:\Users\Y\AppData\Local\pypoetry\Cache\virtualenvs\X-py3.9\Lib\site-packages\pyspark\bin\spark-class2.cmd" could not be found.

Traceback (most recent call last):
  File "C:\X\repositories\schema-integration-customer\tmp_run_builder.py", line 37, in <module>
    spark = get_spark()
  File "C:\X\repositories\data-common\X\da\common\_library\spark.py", line 60, in get_spark
    return builder.getOrCreate()
  File "C:\Users\Y\AppData\Local\pypoetry\Cache\virtualenvs\X-schema-integration-customer-hjO9aLIy-py3.9\lib\site-packages\pyspark\sql\session.py", line 479, in getOrCreate
    else SparkContext.getOrCreate(sparkConf)
  File "C:\Users\Y\AppData\Local\pypoetry\Cache\virtualenvs\X-schema-integration-customer-hjO9aLIy-py3.9\lib\site-packages\pyspark\context.py", line 560, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "C:\Users\y\AppData\Local\pypoetry\Cache\virtualenvs\x-schema-integration-customer-hjO9aLIy-py3.9\lib\site-packages\pyspark\context.py", line 202, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "C:\Users\y\AppData\Local\pypoetry\Cache\virtualenvs\x-schema-integration-customer-hjO9aLIy-py3.9\lib\site-packages\pyspark\context.py", line 480, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "C:\Users\y\AppData\Local\pypoetry\Cache\virtualenvs\x-schema-integration-customer-hjO9aLIy-py3.9\lib\site-packages\pyspark\java_gateway.py", line 106, in launch_gateway
    raise RuntimeError("Java gateway process exited before sending its port number")
RuntimeError: Java gateway process exited before sending its port number

Process finished with exit code 1
```

I use Windows, and with version "databricks-connect==11.3.10" everything runs smoothly.

6 REPLIES

Anonymous
Not applicable

@Lazloo XP:

The error message you received indicates that the Java gateway process failed to launch. This is typically caused by misconfigured environment variables pointing to the locations of the Spark and Java executables.

To resolve this issue, you can try the following steps:

  1. Verify that you have installed versions of Java and Spark that are compatible with Databricks Connect 13.0.0.
  2. Check that the environment variables JAVA_HOME and SPARK_HOME are set and point to the directories where Java and Spark are installed (a quick check is sketched at the end of this reply).
  3. Make sure that the bin directories of both Java and Spark are included in the PATH environment variable.
  4. Ensure that no conflicting versions of Java or Spark are installed on your system.

If you are still encountering issues after verifying these steps, you may want to try reinstalling Databricks Connect 13.0.0 or rolling back to version 11.3.10 if it was working previously.
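As a quick sanity check for steps 2 and 3, you can inspect the relevant variables from Python before touching Spark at all. A minimal sketch (nothing here is specific to Databricks Connect; these are just the standard variables PySpark's launcher relies on):

```
import os
import shutil

# Print the two variables PySpark's launch scripts depend on and flag
# any value that points at a directory which does not exist.
for var in ("JAVA_HOME", "SPARK_HOME"):
    value = os.environ.get(var)
    print(f"{var} = {value!r}")
    if value and not os.path.isdir(value):
        print(f"  -> {var} points to a directory that does not exist")

# The Java gateway can only start if 'java' resolves via PATH.
print("java on PATH:", shutil.which("java"))
```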

Lazloo
New Contributor III

Hey @Suteja Kanuri,

I have no issues with version 11.3.10, so I believe my environment variables are set correctly.

Lazloo
New Contributor III

With the newest version, the error changed to:

```
Traceback (most recent call last):
  File "C:\x\repositories\lf_backup_repo\snippets.py", line 4, in <module>
    spark = SparkSession.builder.getOrCreate()
  File "C:\Users\x\AppData\Local\pypoetry\Cache\virtualenvs\lf-privat-eCptrNhE-py3.8\lib\site-packages\pyspark\sql\session.py", line 469, in getOrCreate
    raise RuntimeError(
RuntimeError: Only remote Spark sessions using Databricks Connect are supported. Could not find connection parameters to start a Spark remote session.
```
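For context: databricks-connect 13 no longer starts a local Spark JVM; it creates a Spark Connect session against a cluster, so the builder has to find connection parameters. One way to supply them is through environment variables before creating the session. A sketch (the host, token, and cluster ID values are placeholders; as I understand it, the client picks up the standard DATABRICKS_* variables via the Databricks SDK configuration):

```
import os

# Placeholders - substitute your workspace URL, a personal access
# token, and the ID of a running cluster.
os.environ["DATABRICKS_HOST"] = "https://<your-workspace>.cloud.databricks.com"
os.environ["DATABRICKS_TOKEN"] = "<personal-access-token>"
os.environ["DATABRICKS_CLUSTER_ID"] = "<cluster-id>"

from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.getOrCreate()
print(spark.range(3).collect())
```

The same values can also be kept in a profile in ~/.databrickscfg, which the client reads as well.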

Hardy
New Contributor III

I am also facing a similar issue with databricks-connect 13.

I am getting "RuntimeError: Only remote Spark sessions using Databricks Connect are supported. Could not find connection parameters to start a Spark remote session."

  

databricks-connect: 13
DBR: 13
Python: 3.10.11

HD
New Contributor II

I get the same error. Any hints would be appreciated.

Susumu_Asaga
New Contributor II

Use this code:

```
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.getOrCreate()
```
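If getOrCreate() alone still cannot find connection parameters, they can also be passed explicitly on the builder. A sketch assuming the remote() builder from databricks-connect 13; the host, token, and cluster_id values are placeholders for your own workspace:

```
from databricks.connect import DatabricksSession

# All three values below are placeholders.
spark = DatabricksSession.builder.remote(
    host="https://<your-workspace>.cloud.databricks.com",
    token="<personal-access-token>",
    cluster_id="<cluster-id>",
).getOrCreate()

print(spark.range(5).collect())
```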

 
