
How to disable Spark Connect on Databricks compute?

chinmay0924
New Contributor III

I want to be able to access the RDD methods of a DataFrame, but it seems this is not supported with Spark Connect. I have been trying to disable Spark Connect in the Spark config using

spark.databricks.service.server.enabled false

but when I check the type of `spark` it is still `pyspark.sql.connect.session.SparkSession`. How do I disable Spark Connect?
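
For context, even a simple call like the following fails on a Spark Connect session, since the RDD API is not implemented there (a minimal sketch):

df = spark.range(10)
rdd = df.rdd  # raises an error under Spark Connect: RDD APIs are unavailable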

4 REPLIES

Bareaj
New Contributor II

If you only want to disable it, set spark.databricks.service.server.enabled to false. However, you might not be able to run your code afterward.
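
To verify whether the flag actually took effect, you could read it back from the running session (a quick sketch; the default argument covers the case where the key is not set at all):

# Read the flag back at runtime; returns the default if the key is unset
print(spark.conf.get("spark.databricks.service.server.enabled", "not set"))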

Bareaj
New Contributor II

Maybe you can do something like:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Spark Connect sessions come from the pyspark.sql.connect.session module
if spark.__class__.__module__ == "pyspark.sql.connect.session":
    ...  # code path when Spark Connect is enabled
else:
    ...  # code path for a classic (non-Connect) session

chinmay0924
New Contributor III

I do want to disable it. I tried setting spark.databricks.service.server.enabled to false in the Spark config while creating the compute, but when I run a notebook on this compute and check type(spark), I still get pyspark.sql.connect.session.SparkSession.

Bareaj
New Contributor II

I have found that when the cluster is shared, it automatically uses that type of session, and in that case I have not been able to disable it. I don't know if this is your situation. The check in my previous reply has helped me avoid some of the problems I ran into.
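
If you are stuck with a Spark Connect session, one workaround is to rewrite the RDD usage with DataFrame-native APIs, which work on both session types. A minimal sketch, assuming the goal was a simple df.rdd.map-style transformation:

from pyspark.sql import functions as F

df = spark.range(10)

# Instead of df.rdd.map(lambda r: r.id * 2), which needs a classic session,
# express the same transformation with DataFrame operations:
doubled = df.select((F.col("id") * 2).alias("doubled"))
doubled.show()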
