Java SQL Driver Manager not working in Unity Catalog shared mode
06-21-2024 04:17 AM - edited 06-21-2024 04:52 AM
Hi All,
We are facing an issue establishing a connection to Azure SQL Server through JDBC in order to perform an UPSERT operation. Please find the connection code and the exception received during the run:
conn = spark._sc._jvm.java.sql.DriverManager.getConnection(url, properties["username"], properties["password"])
statement = conn.createStatement()
statement.execute(merge_query)
statement.close()
conn.close()
But the code throws the following error:
py4j.security.Py4JSecurityException: Method public static java.sql.Connection java.sql.DriverManager.getConnection(java.lang.String) throws java.sql.SQLException is not whitelisted on class class java.sql.DriverManager
Kindly help.
06-23-2024 08:54 PM - edited 06-23-2024 09:05 PM
Hello @sensanjoy ,
Based on this stack trace, the cluster in this case looks to me like a shared access mode cluster with Unity Catalog enabled.
Shared access mode clusters do not allow such direct JVM calls for security reasons.
This use case can be addressed by:
- using a single user cluster, or
- considering Lakehouse Query Federation (https://learn.microsoft.com/en-us/azure/databricks/query-federation/)
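For reference, registering a SQL Server database through Lakehouse Federation is done with two SQL statements. A minimal sketch follows; all names (`sqlserver_conn`, `sqlserver_cat`, the host, the secret scope and keys, `mydb`) are hypothetical placeholders, and note that federated catalogs are read-only.

```python
# Sketch: exposing Azure SQL Server via Lakehouse Federation (read-only).
# Every identifier below is a hypothetical placeholder.

create_connection_sql = """
CREATE CONNECTION IF NOT EXISTS sqlserver_conn TYPE sqlserver
OPTIONS (
  host 'myserver.database.windows.net',
  port '1433',
  user secret('my_scope', 'sql_user'),
  password secret('my_scope', 'sql_password')
)
"""

create_catalog_sql = """
CREATE FOREIGN CATALOG IF NOT EXISTS sqlserver_cat
USING CONNECTION sqlserver_conn
OPTIONS (database 'mydb')
"""

# On a Databricks cluster you would then run:
# spark.sql(create_connection_sql)
# spark.sql(create_catalog_sql)
# df = spark.read.table("sqlserver_cat.dbo.my_table")  # read only; no UPSERT
```

This works on shared access mode clusters because all access goes through Unity Catalog rather than arbitrary JVM calls.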
Regards
06-23-2024 11:48 PM - edited 06-23-2024 11:49 PM
Pretty sure you can use the Apache Spark connector for SQL Server on shared UC clusters.
More info here: https://learn.microsoft.com/en-us/sql/connect/spark/connector?view=sql-server-ver16
I do agree that you should rather use Lakehouse Federation, although you won't be able to do upserts in SQL Server that way; it supports read operations only.
Moving from Non-Isolation Shared to Shared usually comes with a lot of (legacy) code refactoring.
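For completeness, a write through the Spark connector linked above looks roughly like the sketch below. This is an assumption-laden sketch, not the thread author's code: the URL, table, and credentials are hypothetical placeholders, and as discussed the connector only offers `append`/`overwrite` save modes, not an upsert.

```python
# Sketch: batch write with the SQL Server Spark connector on a shared UC cluster.
# url/table/user/password values are hypothetical placeholders.

def connector_options(url, table, user, password):
    """Collect the option names the connector reads (JDBC-style keys)."""
    return {"url": url, "dbtable": table, "user": user, "password": password}

def write_append(df, url, table, user, password):
    """Append a Spark DataFrame to a SQL Server table (no MERGE/UPSERT)."""
    (df.write
       .format("com.microsoft.sqlserver.jdbc.spark")
       .mode("append")  # connector supports append/overwrite only
       .options(**connector_options(url, table, user, password))
       .save())
```

So the connector solves the "write from a shared cluster" part, but not the custom-SQL UPSERT requirement.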
06-24-2024 12:42 AM - edited 06-24-2024 12:47 AM
Thanks @User16502773013 @jacovangelder
That is interesting to know that Lakehouse Federation does not support UPSERT (MERGE INTO ...)!
@jacovangelder I think the approach above (the link you shared) only supports "append" and "overwrite", but the requirement here is to perform an UPSERT (execute some custom SQL).
There is one more way this can be done: through a pyodbc connection, after installing the ODBC driver (ODBC Driver 17 for SQL Server). I am not sure whether it has any drawbacks compared to JDBC, but we can run the same code in shared mode.
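The pyodbc route described above can be sketched as follows. This is a minimal sketch, assuming "ODBC Driver 17 for SQL Server" is installed on the cluster (e.g. via an init script); the server, database, and table names are hypothetical placeholders.

```python
# Sketch: UPSERT into Azure SQL Server with pyodbc on a shared UC cluster.
# Assumes "ODBC Driver 17 for SQL Server" is installed; names are placeholders.

def build_merge_query(target, source, key, cols):
    """Build a T-SQL MERGE upserting `cols` from `source` into `target` on `key`."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    col_list = ", ".join(cols)
    src_list = ", ".join(f"s.{c}" for c in cols)
    return (
        f"MERGE {target} AS t USING {source} AS s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({key}, {col_list}) "
        f"VALUES (s.{key}, {src_list});"
    )

def run_merge(server, database, user, password, merge_query):
    """Open a pyodbc connection and execute the MERGE statement."""
    import pyodbc  # requires the ODBC driver on the cluster
    conn_str = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        f"SERVER={server};DATABASE={database};UID={user};PWD={password}"
    )
    with pyodbc.connect(conn_str) as conn:
        conn.execute(merge_query)
        conn.commit()
```

Since pyodbc runs entirely in the Python process, it avoids the py4j whitelist that blocks `java.sql.DriverManager` on shared clusters.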
06-24-2024 02:49 AM
Just thinking out loud: perhaps you can put the upsert logic in a stored procedure and invoke that procedure using something like pyodbc. Lakehouse Federation definitely cannot write; those connections are read-only.
Shared access mode clusters are mostly limited to functionality inside Databricks because of their security model.
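The stored-procedure idea above could be sketched like this, using the standard ODBC `{CALL ...}` escape sequence. The procedure name and parameters are hypothetical; the connection string follows the same pattern as the pyodbc approach discussed earlier.

```python
# Sketch: invoking an upsert stored procedure from pyodbc.
# Procedure name and parameters are hypothetical placeholders.

def call_proc_statement(proc_name, n_params):
    """Build an ODBC CALL escape, e.g. {CALL dbo.upsert_orders (?, ?)}."""
    if n_params == 0:
        return f"{{CALL {proc_name}}}"
    placeholders = ", ".join(["?"] * n_params)
    return f"{{CALL {proc_name} ({placeholders})}}"

def run_proc(conn_str, proc_name, params):
    """Execute the stored procedure with the given positional parameters."""
    import pyodbc  # ODBC driver must be installed on the cluster
    with pyodbc.connect(conn_str, autocommit=True) as conn:
        conn.execute(call_proc_statement(proc_name, len(params)), *params)
```

Keeping the MERGE inside the database this way also means the cluster only ships parameters, not SQL text.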

