Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Connect to a SQL Server Database with Windows Authentication

Adam_Borlase
New Contributor III

Good day all,

I am in the process of trying to connect to one of our SQL Servers, which uses Entra ID for authentication. When I try to create an external connection to the server in Unity Catalog, it fails on the user-and-password authentication (I am getting a login error rather than a connection error). Is it possible to create this connection using Windows Authentication?

1 ACCEPTED SOLUTION

Accepted Solutions

nayan_wylde
Esteemed Contributor

@Adam_Borlase Can you try these steps to confirm there is no network issue?

Use SQL Authentication

  • Create a SQL Server login (not Entra ID) with a username and password.
  • Grant it access to the required database.
  • Use this credential in Unity Catalog's external connection.
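As a minimal sketch of that last step (the connection name, host, and secret scope/key names below are placeholders, not from the original post), the external connection with SQL authentication can be created in Databricks SQL roughly like this:

```sql
-- Hypothetical example: replace the host, scope, and key names with your own.
CREATE CONNECTION sqlserver_conn TYPE sqlserver
OPTIONS (
  host 'myserver.example.com',  -- placeholder hostname
  port '1433',
  -- a SQL Server login, not an Entra ID / Windows account:
  user secret('my-scope', 'sqlserver-user'),
  password secret('my-scope', 'sqlserver-password')
);
```

A foreign catalog can then be created on top of the connection (e.g. `CREATE FOREIGN CATALOG my_catalog USING CONNECTION sqlserver_conn OPTIONS (database 'MyDatabase')`) so the tables show up in Unity Catalog.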

View solution in original post

4 REPLIES

nayan_wylde
Esteemed Contributor

Unity Catalog currently supports:

  • SQL authentication (username/password)
  • Managed identities (for Azure services such as Azure SQL Database or Azure Synapse)
  • OAuth-based authentication (for some cloud services)

Windows Authentication / Entra ID integrated authentication is not natively supported for external connections to on-prem SQL Server or Azure SQL from Unity Catalog.

Could you please paste the error you are getting when you pass the username and password?

Good afternoon nayan_wylde,

Thank you for your response. I have pasted the error below; it is a bit long, but I was unsure which section you wanted to see:

summary: SQLServerException: Login failed for user #########. ClientConnectionId:############, data: {"type":"baseError","stackFrames":["com.databricks.backend.common.rpc.SparkDriverExceptions$SQLExecutionException: com.microsoft.sqlserver.jdbc.SQLServerException: Login failed for user '##########'. ClientConnectionId:#########","\tat com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:265)","\tat com.microsoft.sqlserver.jdbc.TDSTokenHandler.onEOF(tdsparser.java:300)","\tat com.microsoft.sqlserver.jdbc.TDSParser.parse(tdsparser.java:133)","\tat com.microsoft.sqlserver.jdbc.TDSParser.parse(tdsparser.java:37)","\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.sendLogon(SQLServerConnection.java:6299)","\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.logon(SQLServerConnection.java:4911)","\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.access$100(SQLServerConnection.java:90)","\tat com.microsoft.sqlserver.jdbc.SQLServerConnection$LogonCommand.doExecute(SQLServerConnection.java:4849)","\tat com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7627)","\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:3916)","\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:3358)","\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:2950)","\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connectInternal(SQLServerConnection.java:2790)","\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:1663)","\tat com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:1064)","\tat org.apache.spark.sql.execution.datasources.jdbc.connection.BasicConnectionProvider.getConnection(BasicConnectionProvider.scala:49)","\tat org.apache.spark.sql.execution.datasources.jdbc.connection.ConnectionProviderBase.create(ConnectionProvider.scala:102)","\tat 
org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1(JdbcDialects.scala:179)","\tat org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1$adapted(JdbcDialects.scala:175)","\tat org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.withConnection(JdbcUtils.scala:1221)","\tat org.apache.spark.sql.execution.datasources.v2.jdbc.JDBCTableCatalog.listNamespaces(JDBCTableCatalog.scala:226)","\tat com.databricks.sql.managedcatalog.ManagedCatalogCommon.listSchemas(ManagedCatalogCommon.scala:816)","\tat com.databricks.sql.managedcatalog.ProfiledManagedCatalog.$anonfun$listSchemas$1(ProfiledManagedCatalog.scala:156)","\tat org.apache.spark.sql.catalyst.MetricKeyUtils$.measure(MetricKey.scala:1055)","\tat com.databricks.sql.managedcatalog.ProfiledManagedCatalog.$anonfun$profile$1(ProfiledManagedCatalog.scala:63)","\tat com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)","\tat com.databricks.sql.managedcatalog.ProfiledManagedCatalog.profile(ProfiledManagedCatalog.scala:62)","\tat com.databricks.sql.managedcatalog.ProfiledManagedCatalog.listSchemas(ProfiledManagedCatalog.scala:156)","\tat com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.listDatabasesWithCatalog(ManagedCatalogSessionCatalog.scala:892)","\tat com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.listNamespaces(UnityCatalogV2Proxy.scala:130)","\tat org.apache.spark.sql.execution.datasources.v2.ShowNamespacesExec.run(ShowNamespacesExec.scala:42)","\tat org.apache.spark.sql.execution.datasources.v2.V2CommandExec.$anonfun$result$2(V2CommandExec.scala:48)","\tat org.apache.spark.sql.execution.SparkPlan.runCommandWithAetherOff(SparkPlan.scala:178)","\tat org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:189)","\tat org.apache.spark.sql.execution.datasources.v2.V2CommandExec.$anonfun$result$1(V2CommandExec.scala:48)","\tat com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)","\tat 
org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:47)","\tat org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:45)","\tat org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:56)","\tat org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$4(QueryExecution.scala:391)","\tat org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:168)","\tat org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$3(QueryExecution.scala:391)","\tat org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$9(SQLExecution.scala:408)","\tat org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:754)","\tat org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:280)","\tat org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175)","\tat org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:169)","\tat org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:691)","\tat org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:390)","\tat org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1203)","\tat org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:386)","\tat org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:326)","\tat 
org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:383)","\tat org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:367)","\tat org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:505)","\tat org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:83)","\tat org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:505)","\tat org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:39)","\tat org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:343)","\tat org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:339)","\tat org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:39)","\tat org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:39)","\tat org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:481)","\tat org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:367)","\tat org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:400)","\tat org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:367)","\tat org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:285)","\tat org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:282)","\tat org.apache.spark.sql.Dataset.<init>(Dataset.scala:289)","\tat org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:127)","\tat 
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175)","\tat org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1182)","\tat com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)","\tat org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1182)","\tat org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:116)","\tat org.apache.spark.sql.SparkSession.$anonfun$sql$4(SparkSession.scala:954)","\tat org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175)","\tat org.apache.spark.sql.SparkSession.sql(SparkSession.scala:942)","\tat org.apache.spark.sql.SparkSession.sql(SparkSession.scala:977)","\tat org.apache.spark.sql.SparkSession.sql(SparkSession.scala:1010)","\tat org.apache.spark.sql.SQLContext.sql(SQLContext.scala:696)","\tat com.databricks.backend.daemon.driver.DriverLocal$DbClassicStrategy.executeSQLQuery(DriverLocal.scala:312)","\tat com.databricks.backend.daemon.driver.DriverLocal.executeSQLSubCommand(DriverLocal.scala:402)","\tat com.databricks.backend.daemon.driver.DriverLocal.$anonfun$executeSql$1(DriverLocal.scala:423)","\tat scala.collection.immutable.List.map(List.scala:297)","\tat com.databricks.backend.daemon.driver.DriverLocal.executeSql(DriverLocal.scala:418)","\tat com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:40)","\tat com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$33(DriverLocal.scala:1169)","\tat com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:45)","\tat com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:104)","\tat com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$28(DriverLocal.scala:1160)","\tat com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)","\tat scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)","\tat 
com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)","\tat com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)","\tat com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)","\tat com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:100)","\tat com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:472)","\tat com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)","\tat com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:100)","\tat com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$1(DriverLocal.scala:1096)","\tat com.databricks.backend.daemon.driver.DriverLocal$.$anonfun$maybeSynchronizeExecution$4(DriverLocal.scala:1557)","\tat com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:773)","\tat com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$2(DriverWrapper.scala:795)","\tat scala.util.Try$.apply(Try.scala:213)","\tat com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:787)","\tat com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$3(DriverWrapper.scala:827)","\tat com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:669)","\tat com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:687)","\tat com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)","\tat scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)","\tat com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)","\tat com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)","\tat com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)","\tat 
com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:72)","\tat com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:472)","\tat com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)","\tat com.databricks.backend.daemon.driver.DriverWrapper.withAttributionTags(DriverWrapper.scala:72)","\tat com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:664)","\tat com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:582)","\tat com.databricks.backend.daemon.driver.DriverWrapper.recordOperationWithResultTags(DriverWrapper.scala:72)","\tat com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:827)","\tat com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:694)","\tat com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:739)","\tat com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$runInnerLoop$1(DriverWrapper.scala:569)","\tat scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)","\tat com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)","\tat scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)","\tat com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)","\tat com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)","\tat com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)","\tat com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:72)","\tat com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:569)","\tat com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:491)","\tat 
com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:292)","\tat java.lang.Thread.run(Thread.java:750)","\tat com.databricks.backend.daemon.driver.DriverLocal.executeSql(DriverLocal.scala:475)","\tat com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:40)","\tat com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$33(DriverLocal.scala:1169)","\tat com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:45)","\tat com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:104)","\tat com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$28(DriverLocal.scala:1160)","\tat com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)","\tat scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)","\tat com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)","\tat com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)","\tat com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)","\tat com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:100)","\tat com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:472)","\tat com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)","\tat com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:100)","\tat com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$1(DriverLocal.scala:1096)","\tat com.databricks.backend.daemon.driver.DriverLocal$.$anonfun$maybeSynchronizeExecution$4(DriverLocal.scala:1557)","\tat com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:773)","\tat com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$2(DriverWrapper.scala:795)","\tat scala.util.Try$.apply(Try.scala:213)","\tat 
com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:787)","\tat com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$3(DriverWrapper.scala:827)","\tat com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:669)","\tat com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:687)","\tat com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)","\tat scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)","\tat com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)","\tat com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)","\tat com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)","\tat com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:72)","\tat com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:472)","\tat com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)","\tat com.databricks.backend.daemon.driver.DriverWrapper.withAttributionTags(DriverWrapper.scala:72)","\tat com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:664)","\tat com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:582)","\tat com.databricks.backend.daemon.driver.DriverWrapper.recordOperationWithResultTags(DriverWrapper.scala:72)","\tat com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:827)","\tat com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:694)","\tat com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:739)","\tat com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$runInnerLoop$1(DriverWrapper.scala:569)","\tat 
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)","\tat com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)","\tat scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)","\tat com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)","\tat com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)","\tat com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)","\tat com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:72)","\tat com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:569)","\tat com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:491)","\tat com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:292)","\tat java.lang.Thread.run(Thread.java:750)"],"arguments":{},"addedWidgets":{},"removedWidgets":[],"datasetInfos":[],"metadata":{},"jupyterProps":null,"sqlProps":null}

-werners-
Esteemed Contributor III

Windows authentication will never work on Databricks, as it is bound to a Windows domain (a local-network concept), so you will have to use another authentication method.

nayan_wylde
Esteemed Contributor

@Adam_Borlase Can you try these steps to confirm there is no network issue?

Use SQL Authentication

  • Create a SQL Server login (not Entra ID) with a username and password.
  • Grant it access to the required database.
  • Use this credential in Unity Catalog's external connection.
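For completeness, the SQL Server side of the first two steps might look like the following T-SQL (the login name, password, database, and schema below are placeholders):

```sql
-- Hypothetical example: run on the SQL Server instance itself.
CREATE LOGIN databricks_reader WITH PASSWORD = 'StrongPassword!123';  -- placeholder credentials
GO
USE MyDatabase;  -- placeholder database name
CREATE USER databricks_reader FOR LOGIN databricks_reader;
GRANT SELECT ON SCHEMA::dbo TO databricks_reader;  -- grant only what the connection needs
GO
```

Note that the server must also accept SQL logins at all: if it is configured for Windows-only authentication, mixed-mode (SQL Server and Windows Authentication) has to be enabled first, or the login above will fail with the same "Login failed for user" error.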