
Passing Microsoft MFA Auth from Databricks to MSSQL Managed Instance in a Databricks FastAPI App

SObiero
New Contributor

I have a Databricks App built using FastAPI. Users access this App after authenticating with Microsoft MFA on Databricks Azure Cloud. The App connects to an MSSQL Managed Instance (MI) that also supports Microsoft MFA.

I want the authenticated user's credentials to be passed to the App and used for database connections. In the development environment, this works fine since I am logged in with my Microsoft MFA. Can this be achieved in a Databricks App?

Below is a snippet of the code I am using with SQLAlchemy and PyODBC:

from fastapi import FastAPI, HTTPException
from sqlalchemy import create_engine, event, text
from azure.identity import DefaultAzureCredential
import struct
import pyodbc

app = FastAPI()


@app.get("/data")
def get_data():
    try:
        azure_credentials = DefaultAzureCredential()

        TOKEN_URL = "https://database.windows.net/"
        SQL_COPT_SS_ACCESS_TOKEN = 1256

        server = "azsql-mi-xyz.database.windows.net"
        port = "1234"
        database = "database"
        driver = "ODBC Driver 18 for SQL Server"

        conn_str = f"mssql+pyodbc://@{server}:{port}/{database}?driver={driver}"
        engine = create_engine(conn_str)

        @event.listens_for(engine, "do_connect")
        def provide_token(dialect, conn_rec, cargs, cparams):
            # Remove "Trusted_Connection" parameter added by SQLAlchemy
            cargs[0] = cargs[0].replace(";Trusted_Connection=Yes", "")

            # Generate a fresh token credential for each new connection
            raw_token = azure_credentials.get_token(TOKEN_URL).token.encode("utf-16-le")
            token_struct = struct.pack(f"<I{len(raw_token)}s", len(raw_token), raw_token)

            # Apply it to the connection parameters
            cparams["attrs_before"] = {SQL_COPT_SS_ACCESS_TOKEN: token_struct}

        # Test the connection
        with engine.connect() as connection:
            query = text("SELECT * FROM tablex")
            rows = connection.execute(query).fetchall()

            data = [dict(row._mapping) for row in rows]
            return {"data": data}

    except Exception as e:
        raise HTTPException(status_code=500, detail=f"General Error: {str(e)}")

mark_ott
Databricks Employee

It is not possible to seamlessly pass each authenticated user's Azure/Microsoft identity from a web app running on Databricks to MSSQL MI for per-user MFA authentication in the way your development code does. This limitation stems from how identities and tokens are managed inside Databricks clusters and in web app (FastAPI) authentication flows. Here's a clear breakdown:

Why Per-User MFA Passthrough Isn't Feasible in Databricks

  • Databricks User Context:
    When a user accesses your FastAPI app served on Databricks, they authenticate via Databricks' own MFA layer. However, when code runs on a Databricks cluster (whether it's a notebook, job, or a web app with Flask/FastAPI), it runs under a cluster identity or service principal, not the original user's identity. There's currently no official way to propagate the user's Azure AD token from the frontend to the backend kernel in Databricks, as each notebook or app session is scoped to the Databricks execution context, not the user's Azure session.

  • DefaultAzureCredential Behavior:
    In your development environment, DefaultAzureCredential is able to use your desktop login session (with MFA) to get tokens. In Databricks, it will fall back to the managed identity of the Databricks workspace or the specific cluster/service principal assigned by your administrator. This means every database call from your app running on Databricks will use the same identity, regardless of which end user is making the request (a quick way to verify this is sketched just after this list).

  • No Transparent Token Forwarding:
    There is no mechanism to get the user's MFA token upon web login and use it server-side in Databricks for direct SQL authentication; OAuth flows and browser MFA sessions do not cross this boundary.
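
If you want to confirm which identity your app is actually running as, a minimal diagnostic sketch like the one below decodes the claims of the token that DefaultAzureCredential returns. This is an illustrative addition, not part of the original code; it assumes only the azure-identity package and uses the standard Azure SQL scope.

import base64
import json
from azure.identity import DefaultAzureCredential

# Diagnostic sketch (illustrative): request a token for Azure SQL and
# inspect whose identity it carries.
credential = DefaultAzureCredential()
token = credential.get_token("https://database.windows.net/.default").token

# An Azure AD access token is a JWT: header.payload.signature.
payload_b64 = token.split(".")[1]
payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
claims = json.loads(base64.urlsafe_b64decode(payload_b64))

# Locally this typically prints your own UPN; on Databricks it prints the
# workspace/cluster service principal (an "appid" rather than a user).
print(claims.get("upn") or claims.get("preferred_username") or claims.get("appid"))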

What This Means for Your Architecture

Your current approach works locally because your machine holds your Azure AD credentials and MFA session. In the Databricks runtime, the authentication context is different:

  • All back-end code runs as the cluster/managed identity.

  • The only credentials available are those granted to Databricks' identity, not the connecting user.

Workarounds and Best Practices

If per-user database auditing or row-level security is required, consider these alternatives:

  • Centralized Service Account:
    Use the Databricks workspace identity or a dedicated service principal to access MSSQL MI, and apply the necessary RBAC in SQL based on the application's own notion of the "current user".

  • Impersonation Patterns:
    Pass the Databricks user's identity in your app's requests, and enforce access controls in your database or within the app logic, not at the SQL connection layer (see the sketch after this list).

  • Custom OAuth2 Flows:
    It is theoretically possible to implement a custom OAuth2 flow in your FastAPI app so that the user's browser authenticates with Azure AD and the resulting token is passed to your FastAPI backend for database access. However, this is complex, hard to secure, and rarely supported by direct SQL connection libraries, because the tokens involved are short-lived, tied to a single user, and awkward to refresh non-interactively for SQL connections.
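
As a rough sketch of the impersonation pattern, the example below assumes the platform forwards the signed-in user's e-mail in a request header (shown here as X-Forwarded-Email; verify the exact header your deployment provides) and enforces an app-level allow-list, while the database connection itself still uses the shared service identity. The route, header name, and allow-list are illustrative assumptions.

from fastapi import Depends, FastAPI, HTTPException, Request

app = FastAPI()

# Assumed header name for illustration; check what your deployment forwards.
USER_HEADER = "X-Forwarded-Email"

# Hypothetical app-level authorization list; in practice this could live in
# the database itself (e.g. row-level security keyed on the user name).
ALLOWED_USERS = {"analyst@example.com", "admin@example.com"}


def current_user(request: Request) -> str:
    user = request.headers.get(USER_HEADER)
    if not user:
        raise HTTPException(status_code=401, detail="No forwarded user identity")
    return user


@app.get("/data")
def get_data(user: str = Depends(current_user)):
    if user not in ALLOWED_USERS:
        raise HTTPException(status_code=403, detail=f"{user} is not authorized")
    # The SQL connection still runs as the shared service identity;
    # only the authorization decision here is per end user.
    return {"user": user, "rows": []}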

Example: Centralized Token Connection in Databricks

The typical pattern in Databricks is to authenticate once with the managed identity and let all database traffic from the app use that single identity:

from azure.identity import DefaultAzureCredential
from sqlalchemy import create_engine, event
import struct
import pyodbc

TOKEN_URL = "https://database.windows.net/"
SQL_COPT_SS_ACCESS_TOKEN = 1256

# Running inside Databricks, this resolves to the instance's managed identity
azure_credentials = DefaultAzureCredential()
raw_token = azure_credentials.get_token(TOKEN_URL).token.encode("utf-16-le")
token_struct = struct.pack("<I%ds" % len(raw_token), len(raw_token), raw_token)

conn_str = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=azsql-mi-xyz.database.windows.net,1234;DATABASE=database"
engine = create_engine(f"mssql+pyodbc:///?odbc_connect={conn_str}")

@event.listens_for(engine, "do_connect")
def receive_do_connect(dialect, conn_rec, cargs, cparams):
    cparams["attrs_before"] = {SQL_COPT_SS_ACCESS_TOKEN: token_struct}

All connections will now use the Databricks-managed/service principal identity.
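
For completeness, a query issued through this shared-identity engine would look roughly like the following (tablex is just the placeholder table name from the question):

from sqlalchemy import text

# Every query runs as the managed identity / service principal,
# regardless of which end user triggered the request.
with engine.connect() as connection:
    rows = connection.execute(text("SELECT TOP 10 * FROM tablex")).fetchall()
    print([dict(row._mapping) for row in rows])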

 


Summary Table

Access Scenario             Supported in Databricks?      How Identity is Determined
Local Dev + MFA             Yes                           Active desktop user's Azure AD token
Databricks Web App + MFA    No (per-user not possible)    Cluster/service principal identity
Custom OAuth for DB         Complex/uncommon              Requires significant custom engineering

Key Takeaway

You cannot natively forward MFA-authenticated user identity from a Databricks web app to an MSSQL Managed Instance. All database access from Databricks will be under the workspace or cluster identity, not individual users, in the default and supported architecture.
