We are in the process of migrating to Unity Catalog, establishing connections to SQL Server and Snowflake, and creating foreign catalogs that mirror your SQL Server and Snowflake databases. This allows us to leverage Unity Catalog's query syntax and data governance tools to manage Databricks user access.
However, these features are read-only. This raises an important question: What solutions are people using when there's a need to write back to SQL Server or Snowflake?
So far, we've used JDBC for writing, but this approach lacks the governance Unity Catalog provides for reads and still requires Key Vaults for credential management. And if you have user groups with different access levels, that creates issues: you need a Key Vault for each access group, or in some cases even for each individual user.
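For context, here's a minimal sketch of the JDBC write path we use today. All server names, scope names, and secret keys below are placeholders, not our real infrastructure; the point is that the credentials come from a Key Vault-backed secret scope and bypass Unity Catalog entirely, which is exactly the governance gap we're asking about.

```python
# Sketch of a JDBC write-back to SQL Server from Databricks.
# Everything named here (server, database, scope, keys) is a placeholder.

def jdbc_options(server: str, database: str, user: str, password: str) -> dict:
    """Build the option map passed to df.write.format("jdbc")."""
    return {
        "url": f"jdbc:sqlserver://{server}:1433;databaseName={database}",
        "user": user,
        "password": password,
        "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    }

# On a cluster this would be used roughly like so (secret scope name is
# hypothetical, and each access group needs its own scope/Key Vault):
#
# opts = jdbc_options(
#     "myserver.database.windows.net",
#     "sales_db",
#     dbutils.secrets.get("kv-scope", "sql-user"),
#     dbutils.secrets.get("kv-scope", "sql-password"),
# )
# (df.write.format("jdbc")
#     .options(**opts)
#     .option("dbtable", "dbo.target_table")
#     .mode("append")
#     .save())
```

Note that nothing here is table-level or group-aware: whoever can read the secrets can write anywhere that SQL login can, which is why we end up multiplying Key Vaults per access group.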
Is there a better way to write to SQL Server from Databricks that offers the same level of governance and control as Unity Catalog provides for reads or writes to data lakes?
Do you know whether this feature is on the Databricks roadmap?
We'd love to hear your thoughts!