Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Is Databricks capable of housing OLTP and OLAP?

New Contributor

Hi data experts.

I currently have an OLTP system (Azure SQL DB) that keeps only the past 14 days of data; we use partition switching to achieve that, and an ETL process (Azure Data Factory) feeds the data warehouse (Azure Synapse Analytics). My requirement is to migrate both the OLTP and OLAP workloads into the lakehouse within Databricks. The plan is Structured Streaming from Event Hubs into Bronze, then serving OLTP from Silver and OLAP from Gold. Is this something that is possible within Databricks? If so, can you suggest ways to achieve it, possibly using Delta tables or Delta Live Tables? Thank you.


Community Manager

Hi @TinasheChinyati, migrating your OLTP and OLAP workloads into a Databricks lakehouse is indeed possible.


Load Data into the Lakehouse:

  • Databricks provides tools and capabilities to make data migration to the lakehouse seamless. You can load data from diverse sources into your lakehouse.
  • Consider migrating your existing Parquet data lake to Delta Lake, which offers improved reliability, performance, and schema evolution capabilities.
  • Explore Lakehouse Federation to query data that remains in external systems (for example, your Azure SQL DB) without migrating it first.
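The Parquet-to-Delta migration mentioned above can be done in place. A minimal sketch, assuming a Databricks/Spark runtime where a `SparkSession` is available; the path is illustrative:

```python
def convert_to_delta_sql(parquet_path: str) -> str:
    """Build the Delta Lake SQL that converts a Parquet directory
    to a Delta table in place (writes a transaction log, no data copy)."""
    return f"CONVERT TO DELTA parquet.`{parquet_path}`"


def convert_parquet_dir(spark, parquet_path: str) -> None:
    """Run the conversion on a live SparkSession (Databricks runtime)."""
    spark.sql(convert_to_delta_sql(parquet_path))
```

After conversion, the same directory can be read and written as a Delta table, which is what unlocks ACID transactions and schema evolution for the Silver and Gold layers.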

Databricks Data Intelligence Platform vs. Enterprise Data Warehouse:

  • The Databricks Data Intelligence Platform is built on top of Apache Spark™, Unity Catalog, and Delta Lake.
  • Key differences include: data is stored in open formats on cloud object storage, storage and compute scale independently, and the same tables serve BI, streaming, and machine-learning workloads.

Delta Live Tables:

  • Delta Live Tables (DLT) lets you declaratively define your Bronze, Silver, and Gold tables and their dependencies; the framework handles orchestration, data-quality expectations, and incremental processing.

In summary, Databricks offers a powerful platform for unifying your data ecosystem, allowing analysts, data scientists, and engineers to work against the same tables. Consider leveraging Delta tables and Delta Live Tables to achieve your lakehouse migration goals. 🚀

New Contributor II

Hi @Kaniz_Fatma, I have looked at this topic extensively and have even tried to implement it.
I am a champion of Databricks at my organization, but I do not think it currently enables OLTP scenarios.

The closest I have gotten is using the Statement Execution REST API with a SQL warehouse as the compute, which avoids the 3–5 minute delay of cluster boot-up.

But even with that approach, simple SELECT queries against Delta tables still execute on the order of 5–10 seconds, whereas any decent OLTP database (Oracle, SQL Server, MySQL) performs an order of magnitude faster (under 500 ms).
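For reference, the Statement Execution API approach looks roughly like this. A sketch using only the standard library; the workspace host, token, and warehouse ID are placeholders, and `wait_timeout` asks the API to block briefly for a synchronous result:

```python
import json
import urllib.request


def build_statement_request(host: str, token: str,
                            warehouse_id: str, sql: str) -> urllib.request.Request:
    """Build the POST request for /api/2.0/sql/statements."""
    payload = {
        "warehouse_id": warehouse_id,
        "statement": sql,
        "wait_timeout": "30s",  # block up to 30s for a synchronous result
    }
    return urllib.request.Request(
        url=f"{host}/api/2.0/sql/statements",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


def execute_statement(host: str, token: str, warehouse_id: str, sql: str) -> dict:
    """Send the statement to a running SQL warehouse and return the JSON reply."""
    req = build_statement_request(host, token, warehouse_id, sql)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Even with a warm warehouse, each call pays an HTTP round trip plus query scheduling, which is consistent with the seconds-level latencies described above.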
