Federated connections from Azure Databricks to Azure SQL Database via Lakehouse Federation currently support read-only queries only; running UPDATE commands or executing stored procedures directly through the federated Unity Catalog interface is not supported as of late 2025. Operations such as UPDATE statements or stored procedure calls must therefore be performed through other means, not through the federated SQL interface in Databricks.
Federated Capabilities and Limitations
- Lakehouse Federation allows connecting to and querying data across SQL Server, Azure SQL Database, and other sources using read-only SQL commands for analytics purposes.
- Write operations such as INSERT, UPDATE, or DELETE, and direct execution of stored procedures, are typically restricted or unsupported in federated mode.
- The only writeable exception is legacy Hive metastore catalog federation, where foreign tables can be writeable; this does not apply to Azure SQL DB federation.
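For context, a read-only federated query against a foreign catalog looks like an ordinary three-level-namespace query in Databricks SQL. The catalog, schema, and table names below are illustrative, not taken from any real workspace:

```sql
-- 'azsql_catalog' is a hypothetical foreign catalog created over an
-- Azure SQL DB connection; only read-style statements like SELECT work here.
SELECT customer_id, order_total
FROM azsql_catalog.sales.orders
WHERE order_date >= '2025-01-01';
```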
Alternative Methods for Updates and Procedures
- To execute UPDATE statements or stored procedures, use a direct SQL connection (ODBC or JDBC) from Databricks notebooks or jobs rather than federated queries.
- This involves installing an appropriate driver (such as pyodbc for Python) and connecting directly to the Azure SQL DB; stored procedures can then be executed using standard SQL commands within your notebook or code.
-
Example (Python/pyodbc):

import pyodbc

# Placeholder connection string; fill in your server, database, and credentials.
conn_str = ("DRIVER={ODBC Driver 18 for SQL Server};"
            "SERVER=<server>.database.windows.net;DATABASE=<db>;"
            "UID=<user>;PWD=<password>;Encrypt=yes")
conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
# Parameterized stored-procedure call; values are bound, not inlined in the SQL.
cursor.execute("EXEC YourStoredProcedureName @param1 = ?, @param2 = ?", (value1, value2))
conn.commit()
conn.close()
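The EXEC pattern above generalizes to any procedure. As an illustration, a small helper (hypothetical, not part of pyodbc or any Databricks API) can compose the parameterized statement from a procedure name and its parameter names:

```python
# Hypothetical helper: build a parameterized T-SQL EXEC statement whose "?"
# placeholders line up with pyodbc's positional parameter binding.
def build_exec_statement(proc_name, param_names):
    if not param_names:
        return f"EXEC {proc_name}"
    placeholders = ", ".join(f"@{name} = ?" for name in param_names)
    return f"EXEC {proc_name} {placeholders}"

# Example: build_exec_statement("dbo.RefreshOrders", ["region", "cutoff"])
# -> "EXEC dbo.RefreshOrders @region = ?, @cutoff = ?"
```

The returned string is then passed to cursor.execute together with a tuple of values, which keeps the values out of the SQL text and avoids injection issues.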
Recent Developments
- Despite improvements to Databricks SQL and Lakehouse Federation throughout 2025, full DML (Data Manipulation Language) support for federated SQL databases has not been announced; federated access remains read-only except in the Hive metastore scenario.
Summary:
Federated queries to Azure SQL DB from Databricks are read-only and do not support UPDATE commands or stored procedure execution. For these actions, establish a direct database connection using ODBC/JDBC from your Databricks notebook or workflow.