11-11-2021 07:17 PM
Would it be possible to activate a custom extension like Sedona (https://sedona.apache.org/download/databricks/) in SQL Endpoints?
Example error:
java.lang.ClassNotFoundException: org.apache.spark.sql.sedona_sql.UDT.GeometryUDT at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thrift'. (35) (SQLExecDirectW)
07-06-2022 01:43 PM
@Kaniz Fatma does this work for SQL Endpoints as well as clusters?
I am able to install this extension using the Sedona jars with no problems on my compute cluster, but it seems to me that the UI doesn't have an option for me to do this for a SQL Warehouse. I am using azure, so I'm not sure if that makes it different.
For example, when I click into my compute cluster, I see the tab shown at the bottom, while on my "starter endpoint" I don't see an analogous option.
Thanks in advance!
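For context, the cluster-side setup that works is roughly the Spark configuration below, applied to the compute cluster after attaching the Sedona jars (following the Sedona Databricks guide linked above). Exact class names depend on your Sedona/Spark versions, so treat this as a sketch rather than the exact configuration used here:

```
spark.serializer          org.apache.spark.serializer.KryoSerializer
spark.kryo.registrator    org.apache.sedona.core.serde.SedonaKryoRegistrator
spark.sql.extensions      org.apache.sedona.sql.SedonaSqlExtensions
```

On a regular cluster these properties go under Advanced Options > Spark config; the question in this thread is that SQL Warehouses expose no equivalent place to set them.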
11-20-2023 12:03 AM
Having the same issue. Working with Sedona in a cluster is no problem. However, when using a SQL endpoint the following error occurs:
org.apache.spark.sql.sedona_sql.UDT.GeometryUDT
I'm working in Azure Databricks (latest version).
Wondering if anyone has a solution or workaround.
12-15-2023 04:39 AM
Same here.
04-04-2024 03:29 PM
@Retired_mod any updates here?
04-09-2024 01:49 AM
Same here on Azure Databricks.
04-10-2024 09:42 AM
@Retired_mod What is the right way to add custom spark extension to sql warehouse clusters?