I have exactly the same problem.
The assertion that fails is on line 84 of pyspark/ml/wrapper.py:
assert sc is not None
I create the Spark session with Databricks Connect as follows:
from databricks.connect import DatabricksSession
spark = DatabricksSession.builder.remote().getOrCreate()
The call stack is as follows:
My file: VectorAssembler(inputCols=..., outputCol=...)
  pyspark/__init__.py, line 120, in wrapper
    return func(self, **kwargs)
  pyspark/ml/feature.py, line 5364, in __init__
    self._java_obj = self._new_java_obj("org.apache.spark.ml.feature.VectorAssembler", self.uid)
  pyspark/ml/wrapper.py, line 84, in _new_java_obj
    assert sc is not None
pyspark version is 3.5.0
databricks-connect version is 15.4.4