- You have a couple of options for writing data into a data warehouse. Some DWs have purpose-built connectors that provide high-performance transfer between Databricks and the DW (for example, there are Spark connectors for Snowflake and for Azure Synapse DW).
- Some data warehouses can connect to external stores, so you can use that capability to read data from files in cloud storage. For example, you can use Synapse's PolyBase connector to read data from ADLS Gen2.
- Furthermore, Databricks supports generic JDBC connections, so any DW that accepts a JDBC connection can be reached that way as well.
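As a minimal sketch of the JDBC route above: the helper below builds the option map Spark's JDBC writer expects and wraps the write call. The host, database, and table names are hypothetical, and the write itself only runs against a live cluster with a reachable DW.

```python
def jdbc_options(host: str, database: str, user: str, password: str) -> dict:
    """Build the options Spark's JDBC data source expects (hypothetical SQL Server-style URL)."""
    return {
        "url": f"jdbc:sqlserver://{host};database={database}",
        "user": user,
        "password": password,
    }

def write_to_dw(df, table: str, opts: dict) -> None:
    """Append a Spark DataFrame to a DW table over JDBC.

    `df` is assumed to be a pyspark.sql.DataFrame; this executes on a cluster.
    """
    (df.write
       .format("jdbc")
       .options(**opts)
       .option("dbtable", table)
       .mode("append")
       .save())
```

In practice you would call `write_to_dw(df, "dbo.sales", jdbc_options(...))` from a notebook; special connectors (Snowflake, Synapse) follow the same `df.write.format(...).options(...)` pattern with a different format name.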
In all of these scenarios, data is usually written from a DataFrame either to cloud storage or directly into a database table.
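For the cloud-storage path, the sketch below stages a DataFrame as Parquet on ADLS Gen2 so the DW can ingest it (e.g. Synapse via PolyBase). The container, account, and folder names are assumptions for illustration, and the write itself requires a cluster configured with storage credentials.

```python
def staging_path(container: str, account: str, subdir: str) -> str:
    """ABFS URI for a folder on ADLS Gen2 (hypothetical container/account names)."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{subdir}"

def stage_to_storage(df, path: str) -> None:
    """Write a Spark DataFrame as Parquet files the DW can read externally.

    `df` is assumed to be a pyspark.sql.DataFrame; runs on a cluster.
    """
    df.write.mode("overwrite").parquet(path)
```

The DW then points an external table or a COPY/PolyBase load at that path rather than receiving rows over a connection.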
This process can be burdensome at times, so it is usually recommended to move your SQL users onto Databricks SQL and treat your DW as a data source instead. This is typically easier to manage and saves on costs.