
Native service principal support in JDBC/ODBC drivers

harripy
New Contributor III

I read in the Databricks integration best practices about native support for service principal authentication in the JDBC/ODBC drivers. The timeline mentioned for this was "expected to land in 2023"; is this referring to the OAuth machine-to-machine (M2M) section of the authentication documentation, https://docs.databricks.com/en/integrations/jdbc/authentication.html#oauth-machine-to-machine-m2m-au... ?
The example shows these parameters:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

String url = "jdbc:databricks://<server-hostname>:443";
Properties p = new Properties();
p.put("httpPath", "<http-path>");
p.put("AuthMech", "11");   // OAuth 2.0
p.put("Auth_Flow", "1");   // client credentials (M2M) flow
p.put("OAuth2ClientId", "<service-principal-application-id>");
p.put("OAuth2Secret", "<service-principal-oauth-secret>");
Connection conn = DriverManager.getConnection(url, p);

So presumably this works with Databricks service principals (on AWS), but does it also work in Azure with Microsoft Entra ID service principals? (At the very least, the tenant ID would still need to be provided somehow.)


2 REPLIES

daniel_sahal
Esteemed Contributor

@harripy 

As it says in the documentation:

JDBC driver 2.6.36 and above supports Azure Databricks OAuth secrets for OAuth M2M or OAuth 2.0 client credentials authentication. Microsoft Entra ID secrets are not supported.


https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/authentication#--oauth-machine-...
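
For reference, a minimal sketch of what the supported path looks like on Azure: the same AuthMech=11 / Auth_Flow=1 properties, but with a Databricks-generated OAuth secret for the service principal rather than an Entra ID client secret. The hostname, HTTP path and test query below are placeholders of my own, not taken from the docs:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class AzureM2mExample {
    public static void main(String[] args) throws Exception {
        // Azure Databricks workspace hostname (placeholder)
        String url = "jdbc:databricks://adb-1234567890123456.7.azuredatabricks.net:443";

        Properties p = new Properties();
        p.put("httpPath", "/sql/1.0/warehouses/<warehouse-id>"); // SQL warehouse HTTP path
        p.put("AuthMech", "11"); // OAuth 2.0
        p.put("Auth_Flow", "1"); // client credentials (M2M)
        p.put("OAuth2ClientId", "<service-principal-application-id>");
        // Databricks OAuth secret generated for the service principal,
        // not a Microsoft Entra ID client secret
        p.put("OAuth2Secret", "<databricks-oauth-secret>");

        try (Connection conn = DriverManager.getConnection(url, p);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT current_user()")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}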

harripy
New Contributor III

Thanks for pointing this out, so indeed OAuth M2M should only be used with Databricks service principals.

Interestingly, I found that on Azure Databricks, SQL warehouse permissions cannot be set for a Databricks service principal (at least not through the GUI; the service principal cannot be found in the Permissions menu). That rules out this connection method, since "Can Use"/"Can Manage" permissions cannot be granted to the Databricks service principal. Or is there another way to grant them?
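
One thing I have not tried yet is setting the warehouse ACL through the Permissions REST API instead of the UI. A rough sketch of what that call might look like is below; the endpoint path (/api/2.0/permissions/sql/warehouses/<warehouse-id>), the payload field names and the permission level are assumptions taken from my reading of the Permissions API docs, not something I have verified, so check them against the API reference for your workspace:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class GrantWarehousePermission {
    public static void main(String[] args) throws Exception {
        String host = "https://adb-1234567890123456.7.azuredatabricks.net"; // workspace URL (placeholder)
        String warehouseId = "<warehouse-id>";
        // Token of an identity that can manage the warehouse
        String token = System.getenv("DATABRICKS_TOKEN");

        // PATCH should add to the existing ACL; PUT would replace it.
        String body = """
            {
              "access_control_list": [
                {
                  "service_principal_name": "<service-principal-application-id>",
                  "permission_level": "CAN_USE"
                }
              ]
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(host + "/api/2.0/permissions/sql/warehouses/" + warehouseId))
                .header("Authorization", "Bearer " + token)
                .header("Content-Type", "application/json")
                .method("PATCH", HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}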