Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Migrating to Unity Catalog: Read-Only Connections to SQL Server and Snowflake

Direo
Contributor

We are in the process of migrating to Unity Catalog, establishing connections to SQL Server and Snowflake, and creating foreign catalogs that mirror our SQL Server and Snowflake databases. This lets us leverage Unity Catalog's query syntax and data governance tools to manage Databricks user access.

However, these features are read-only. This raises an important question: What solutions are people using when there’s a need to write back to SQL Server or Snowflake?

So far, we've used JDBC for writing, but this approach lacks the governance that Unity Catalog provides for reads and still requires Key Vaults for credential management. And if you have user groups with different access levels, that creates issues: you need a Key Vault for each access group, or in some cases even for a single user.
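For reference, the JDBC write-back pattern described above looks roughly like this (a minimal sketch; the server, database, table, and secret-scope names are hypothetical, and the secret scope is assumed to be backed by Azure Key Vault):

```python
# Sketch of a Spark JDBC write to SQL Server from Databricks.
# All names below are placeholders, not real infrastructure.

def jdbc_options(server: str, database: str, user: str, password: str) -> dict:
    """Build the option map for a Spark JDBC write to SQL Server."""
    return {
        "url": f"jdbc:sqlserver://{server}:1433;databaseName={database}",
        "dbtable": "dbo.target_table",  # hypothetical target table
        "user": user,
        "password": password,
        "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    }

# In a Databricks notebook the credentials would come from a
# Key Vault-backed secret scope, e.g.:
#   user = dbutils.secrets.get("kv-scope", "sql-user")
#   password = dbutils.secrets.get("kv-scope", "sql-password")
opts = jdbc_options("myserver.database.windows.net", "mydb", "user", "pw")

# The actual write (run on a cluster with the JDBC driver available):
# df.write.format("jdbc").options(**opts).mode("append").save()
```

This is exactly the part that sits outside Unity Catalog's governance: access is controlled by whoever can read the secret scope, not by catalog grants.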

Is there a better way to write to SQL Server from Databricks that offers the same level of governance and control as Unity Catalog provides for reads or writes to data lakes?

Do you know whether this feature is in the Databricks plans?

We’d love to hear your thoughts!

1 REPLY

brian999
Contributor

We just use SQLAlchemy to connect to Snowflake, which, you're right, does not provide Databricks governance.
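For anyone else going this route, a minimal sketch of the SQLAlchemy approach (account, credentials, and table names are hypothetical; it assumes the snowflake-sqlalchemy dialect is installed on the cluster):

```python
# Sketch of writing to Snowflake via SQLAlchemy from a Databricks notebook.
# All identifiers below are placeholders.

def snowflake_url(user: str, password: str, account: str,
                  database: str, schema: str, warehouse: str) -> str:
    """Build a Snowflake SQLAlchemy connection URL in the documented format."""
    return (
        f"snowflake://{user}:{password}@{account}/"
        f"{database}/{schema}?warehouse={warehouse}"
    )

url = snowflake_url("svc_user", "pw", "myaccount", "mydb", "public", "wh")

# With the dialect installed, the write itself would look like:
# from sqlalchemy import create_engine
# engine = create_engine(url)
# pandas_df.to_sql("target_table", engine, if_exists="append", index=False)
```

As with JDBC, the credentials live outside Unity Catalog (e.g. in a secret scope), so none of the catalog's access controls apply to these writes.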
