11-11-2021 07:17 PM
Would it be possible to activate a custom extension like Sedona (https://sedona.apache.org/download/databricks/) in SQL Endpoints?
Example error:
java.lang.ClassNotFoundException: org.apache.spark.sql.sedona_sql.UDT.GeometryUDT at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thrift'. (35) (SQLExecDirectW)
05-21-2022 11:04 AM
Hi @JUAN MENDEZ, you just need to install the Sedona jars and Sedona Python on Databricks using Databricks' default web UI. Then everything will work.
Please have a look at the documentation.
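For anyone landing here from search, a minimal sketch of the cluster-level setup the Sedona Databricks guide describes (artifact and class names below match Sedona 1.x and may differ in newer releases, so treat them as assumptions and check the guide for your runtime): install the sedona-spark-shaded jar and the apache-sedona PyPI package as cluster libraries, then add Spark configuration along these lines under the cluster's Advanced options:

spark.serializer org.apache.spark.serializer.KryoSerializer
spark.kryo.registrator org.apache.sedona.core.serde.SedonaKryoRegistrator
spark.sql.extensions org.apache.sedona.sql.SedonaSqlExtensions

Note that these fields are only exposed for interactive and job clusters; whether a SQL Warehouse offers an equivalent is exactly what the rest of this thread is asking.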
07-06-2022 01:43 PM
@Kaniz Fatma does this work for SQL Endpoints as well as clusters?
I am able to install this extension using the Sedona jars with no problems on my compute cluster, but the UI doesn't seem to have an option to do this for a SQL Warehouse. I am using Azure, so I'm not sure if that makes a difference.
For example, when I click into my compute cluster, I see the tab shown at the bottom, while on my "starter endpoint" I don't see an analogous option.
Thanks in advance!
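For context, here is a minimal check that the extension is active on an interactive cluster, which is the setup reported as working above. This is a hypothetical sketch: it assumes the Sedona Python package and jars are installed on the cluster as described earlier, and a Sedona version recent enough to ship SedonaContext (roughly 1.4+; older releases register via SedonaRegistrator.registerAll instead).

# Run in a notebook attached to the interactive cluster (not the SQL Warehouse).
# 'spark' is the SparkSession Databricks provides in the notebook.
from sedona.spark import SedonaContext

# Register Sedona's SQL functions and UDTs on the existing SparkSession.
sedona = SedonaContext.create(spark)

# If this prints POINT (1 2), the extension is active on the cluster.
sedona.sql("SELECT ST_AsText(ST_Point(1.0, 2.0)) AS wkt").show()

By contrast, the posts below report the GeometryUDT ClassNotFoundException when going through a SQL endpoint, where no comparable library or Spark-config option appears in the UI.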
11-20-2023 12:03 AM
Having the same issue. Working with Sedona on a cluster is no problem. However, when using a SQL endpoint the following error occurs:
org.apache.spark.sql.sedona_sql.UDT.GeometryUDT
I'm working in Azure Databricks (latest version).
Wondering if anyone has a solution or workaround.
12-15-2023 04:39 AM
Same here.
04-04-2024 03:29 PM
@Kaniz_Fatma any updates here?
04-09-2024 01:49 AM
Same here on Azure Databricks.
04-10-2024 09:42 AM
@Kaniz_Fatma What is the right way to add custom Spark extensions to SQL Warehouse clusters?