
Spark context not implemented error when using Databricks Connect

MightyMasdo
New Contributor II

I am developing an application using Databricks Connect, and when I try to use VectorAssembler I get an AssertionError from the check "assert sc is not None". Is there a workaround for this?
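
For reference, a minimal sketch of the failing pattern (the column names here are hypothetical; the session comes from Databricks Connect rather than classic PySpark):

from databricks.connect import DatabricksSession
from pyspark.ml.feature import VectorAssembler

# Spark Connect session via Databricks Connect; no local JVM SparkContext is started
spark = DatabricksSession.builder.remote().getOrCreate()

# Merely constructing the classic pyspark.ml transformer raises
# AssertionError from "assert sc is not None" in pyspark/ml/wrapper.py,
# because the wrapper needs a local SparkContext to build its Java object.
assembler = VectorAssembler(inputCols=["col_a", "col_b"], outputCol="features")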

2 REPLIES

Yeshwanth
Databricks Employee

@MightyMasdo, could you please share a screenshot of the error along with the command that triggers it?

ลukasz1
New Contributor II

I have exactly the same problem.

The error is raised on line 84 of pyspark/ml/wrapper.py:
assert sc is not None

I create the Spark session with Databricks Connect as follows:

from databricks.connect import DatabricksSession
spark = DatabricksSession.builder.remote().getOrCreate()
 

The call stack is as follows:

My file: VectorAssembler(inputCols=..., outputCol=...)

pyspark/__init__.py, line 120, in wrapper
return func(self, **kwargs)

pyspark/ml/feature.py, line 5364, in __init__
self._java_obj = self._new_java_obj("org.apache.spark.ml.feature.VectorAssembler", self.uid)

pyspark/ml/wrapper.py, line 84, in _new_java_obj
assert sc is not None
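
For context, the classic pyspark.ml wrappers build their underlying Java objects through the local SparkContext, and a Spark Connect session never creates one, which is why the assertion fails. This can be seen directly (plain pyspark, nothing hypothetical):

from pyspark import SparkContext

# Under classic PySpark this is the live SparkContext; under a Databricks
# Connect / Spark Connect session no local JVM is started, so it remains
# None -- exactly the value that line 84 of pyspark/ml/wrapper.py asserts against.
print(SparkContext._active_spark_context)  # None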

pyspark version is 3.5.0
databricks-connect version is 15.4.4
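
One environment detail that may be worth checking: databricks-connect ships with its own copy of pyspark, and the Databricks Connect documentation recommends uninstalling any standalone pyspark before installing databricks-connect, since having both in one environment is a known source of conflicts. A small standard-library check:

import importlib.metadata as md

# If both distributions report a version, the environment has the
# databricks-connect / standalone-pyspark conflict described above.
for pkg in ("databricks-connect", "pyspark"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed")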
