Spark context not implemented error when using Databricks Connect
06-07-2024 02:23 AM
I am developing an application using Databricks Connect, and when I try to use VectorAssembler I get the error "AssertionError: assert sc is not None". Is there a workaround for this?
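For reference, a minimal sketch of the kind of code involved (the column names and data are placeholders, and it assumes a standard Databricks Connect session created via DatabricksSession with authentication already configured):

```python
from databricks.connect import DatabricksSession
from pyspark.ml.feature import VectorAssembler

# Databricks Connect opens a Spark Connect session against a remote cluster;
# cluster and authentication details are assumed to come from a configured profile.
spark = DatabricksSession.builder.getOrCreate()

df = spark.createDataFrame([(1.0, 2.0), (3.0, 4.0)], ["a", "b"])

# This constructor fails with "AssertionError: assert sc is not None", because
# classic pyspark.ml estimators build their backing JVM object through a
# driver-local SparkContext, which a Spark Connect session does not have.
assembler = VectorAssembler(inputCols=["a", "b"], outputCol="features")
```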
Labels: Automl
06-07-2024 06:35 AM
@MightyMasdo Could you please share a screenshot of the error along with the command?
01-28-2025 12:02 AM
I have exactly the same problem.
The error is raised at line 84 of pyspark/ml/wrapper.py:

assert sc is not None

I create the Spark session with Databricks Connect. The call stack is the following:

My file: VectorAssembler(inputCols=..., outputCol=...)
  pyspark/__init__.py, line 120, in wrapper
    return func(self, **kwargs)
  pyspark/ml/feature.py, line 5364, in __init__
    self._java_obj = self._new_java_obj("org.apache.spark.ml.feature.VectorAssembler", self.uid)
  pyspark/ml/wrapper.py, line 84, in _new_java_obj
    assert sc is not None
pyspark version is 3.5.0
databricks-connect version is 15.4.4
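A small illustration of why the assertion fires (assuming a configured Databricks Connect profile): the classic pyspark.ml wrappers look up the driver-local SparkContext before constructing their Java object, and a Spark Connect session never creates one.

```python
from databricks.connect import DatabricksSession
from pyspark import SparkContext

# The session returned by Databricks Connect is a Spark Connect session,
# not a classic JVM-backed SparkSession.
spark = DatabricksSession.builder.getOrCreate()

# This is the value pyspark/ml/wrapper.py checks before building a Java object.
# Under Databricks Connect it is None, hence "assert sc is not None" fails.
print(SparkContext._active_spark_context)  # prints: None
```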

