We needed to move to databricks-connect>13.x. Now I'm facing an issue when I work with a nested dataframe of the structure

```
root
 |-- a: string (nullable = true)
 |-- b: array (nullable = true)
 |    |-- element: struct (containsNull = true)
 |    |    |-- c: s...
```
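For reference, a single row of that schema can be pictured as plain Python data (the values are hypothetical, and treating the truncated field `c: s...` as a string is only an assumption for illustration):

```python
# Hypothetical row matching the nested schema above:
# column "a" is a string, column "b" an array of structs,
# each struct holding a field "c" (assumed string here).
row = {
    "a": "id-1",
    "b": [
        {"c": "x"},
        {"c": "y"},
    ],
}
```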
With the newest version of databricks-connect, I cannot configure the extra jars I want to use. In the older version, I did that via

```
spark = SparkSession.builder.appName('DataFrame').\
    config('spark.jars.packages','org.apache.spark:spark-avro_...
```
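For context, the value passed to `spark.jars.packages` is a Maven coordinate of the form `groupId:artifactId:version`, where the spark-avro artifact name carries the Scala binary version and the artifact version should match the cluster's Spark version. A small hypothetical helper (the versions shown are only examples) that assembles such a coordinate:

```python
def avro_package(scala_version: str, spark_version: str) -> str:
    """Assemble the Maven coordinate for spark-avro.

    The artifact name is suffixed with the Scala binary version
    (e.g. 2.12), and the artifact version must match the Spark
    version running on the cluster.
    """
    return f"org.apache.spark:spark-avro_{scala_version}:{spark_version}"

# Versions below are illustrative, not a recommendation:
coordinate = avro_package("2.12", "3.3.0")
# "org.apache.spark:spark-avro_2.12:3.3.0"
```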
I installed the newest version, databricks-connect==13.0.0. Now I get the error: Command "C:\Users\Y\AppData\Local\pypoetry\Cache\virtualenvs\X-py3.9\Lib\site-packages\pyspark\bin\spark-class2.cmd" could not be found. Traceback...
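One likely cause, as commonly reported for databricks-connect 13.x, is that the bundled pyspark no longer ships the local launch scripts, so any code path that falls back to starting a local Spark JVM looks for `spark-class2.cmd` and fails. A minimal diagnostic sketch (the path handling is my assumption, not part of databricks-connect):

```python
import os
from pathlib import Path
from typing import Optional

def find_spark_class2(spark_home: Optional[str] = None) -> Optional[Path]:
    """Return the path to spark-class2.cmd if it exists, else None.

    Checks an explicit directory first, then SPARK_HOME; a None
    result means a local Spark launch cannot work from that location.
    """
    home = spark_home or os.environ.get("SPARK_HOME")
    if not home:
        return None
    candidate = Path(home) / "bin" / "spark-class2.cmd"
    return candidate if candidate.exists() else None
```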
Currently, I am facing an issue since the `databricks-connect` runtime on our cluster was updated to 10.4. Since then, I can no longer load the jars for spark-avro when running the following code:

```
from pyspark.sql import SparkSession
spark = SparkSe...
```
In addition, here is the full stack trace:

```
23/12/07 14:51:56 ERROR SerializingExecutor: Exception while executing runnable grpc_shaded.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable@33dfd6ec
grpc_shaded.io.grpc...
```
With the newest version, the error changed to

```
Traceback (most recent call last):
  File "C:\x\repositories\lf_backup_repo\snippets.py", line 4, in <module>
    spark = SparkSession.builder.getOrCreate()
  File "C:\Users\x\AppData\Local\pypoetry\Cache\...
```