@Giovanni Allegri :
The configuration you have provided registers the Sedona SQL extensions with Spark SQL. To make Sedona types and functions available in PySpark as well, the cluster needs a slightly different setup.
You can add the following properties to the Spark cluster configuration to enable automatic registration of Sedona types and functions:

spark.sql.extensions org.apache.sedona.sql.SedonaSqlExtensions
spark.serializer org.apache.spark.serializer.KryoSerializer
spark.kryo.registrator org.apache.sedona.core.serde.SedonaKryoRegistrator
With these settings in place, Sedona types and functions are registered automatically whenever a PySpark session is created on the cluster. Alternatively, you can register them explicitly in your PySpark code with SedonaRegistrator.registerAll(spark); the drawback is that you have to make that call in every new PySpark session.
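For the explicit approach, a minimal sketch looks like this (it assumes the apache-sedona Python package is installed alongside PySpark, and the app name is just a placeholder):

```python
from pyspark.sql import SparkSession
from sedona.register import SedonaRegistrator

# Create (or reuse) a Spark session; on a cluster this is usually
# already provided as `spark`.
spark = SparkSession.builder.appName("sedona-example").getOrCreate()

# Register Sedona's spatial types and SQL functions with this session.
SedonaRegistrator.registerAll(spark)

# Sedona SQL functions such as ST_Point are now available:
spark.sql("SELECT ST_Point(1.0, 2.0) AS geom").show()
```

Note that registerAll only affects the session it is called on, which is why the cluster-level configuration above is more convenient for shared environments.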
I hope this helps!