Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

What is the use case of having Azure Synapse (DWH) and Delta Lake (Gold), given we can connect BI to Delta directly?

Shuvi
New Contributor III

The curated zone is pushed to a cloud data warehouse such as Synapse Dedicated SQL Pools, which then acts as a serving layer for BI tools and analysts.

I believe we can instead keep models in the gold layer and have BI connect to that layer directly, or use serverless infrastructure for ad hoc querying.
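A minimal sketch of that second option: an ad hoc client querying a gold-layer Delta table straight through a Databricks SQL warehouse, with no copy into Synapse. The hostname, HTTP path, token, and table name are placeholders, and the third-party `databricks-sql-connector` package is assumed to be installed; treat this as an illustration, not a reference implementation.

```python
# Sketch: ad hoc query against a gold-layer Delta table via a Databricks SQL
# warehouse endpoint. Connection details below are hypothetical placeholders.

def gold_layer_query(table: str, limit: int = 100) -> str:
    """Build a simple ad hoc query against a gold-layer table."""
    return f"SELECT * FROM {table} LIMIT {limit}"

def run_adhoc_query(hostname: str, http_path: str, token: str, table: str):
    """Run the query through a Databricks SQL warehouse.

    Requires: pip install databricks-sql-connector
    """
    from databricks import sql  # imported lazily; optional dependency
    with sql.connect(server_hostname=hostname,
                     http_path=http_path,
                     access_token=token) as conn:
        with conn.cursor() as cur:
            cur.execute(gold_layer_query(table))
            return cur.fetchall()
```

BI tools take the same route: they point at the SQL warehouse's hostname and HTTP path via their native Databricks connector, so no second copy of the gold data is needed.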

1 ACCEPTED SOLUTION

-werners-
Esteemed Contributor III

Not necessarily. Databricks SQL can also be very fast, so I'd compare performance/cost for one of your 'heavy' workloads.

And do not forget to take the cost of copying data into account (which you do not have to pay with Databricks SQL).
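The copy-cost point above can be made concrete with a back-of-envelope model: serving BI straight from gold only costs query compute, while the Synapse route adds load compute plus a duplicated copy of the data. Every rate and volume below is an invented placeholder for illustration; substitute figures from your own bills.

```python
# Back-of-envelope cost comparison of the two serving architectures.
# All numbers are made-up placeholders, not real Azure/Databricks prices.

def direct_databricks_sql_cost(query_hours: float, dbu_rate: float) -> float:
    """Serving BI straight from gold Delta tables: only query compute."""
    return query_hours * dbu_rate

def synapse_serving_cost(query_hours: float, dwh_rate: float,
                         copy_hours: float, copy_rate: float,
                         storage_gb: float, storage_rate: float) -> float:
    """Copying gold into Synapse adds load compute and duplicated storage."""
    return (query_hours * dwh_rate
            + copy_hours * copy_rate      # recurring load into the dedicated pool
            + storage_gb * storage_rate)  # second copy of the gold data

direct = direct_databricks_sql_cost(query_hours=40, dbu_rate=0.7)     # ~28
synapse = synapse_serving_cost(query_hours=40, dwh_rate=0.5,
                               copy_hours=8, copy_rate=0.6,
                               storage_gb=500, storage_rate=0.02)     # ~34.8
```

With these particular placeholder rates the Synapse path is more expensive even though its per-query rate is lower, because the copy and storage overheads dominate; with different rates the comparison can flip, which is exactly why measuring your own heavy workload matters.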


3 REPLIES

-werners-
Esteemed Contributor III

In the past I would have said yes, because running queries on a Spark cluster is not that fast.

But right now, with Databricks SQL, I don't immediately see the added value.

A use case could be a proprietary login method (a SQL Server login, for example), or more tuning options (table types, for example).

It can also be that Synapse is cheaper/faster than Databricks SQL for your use case.

So in general I'd say there is no reason to use Synapse, but it is possible that for your specific use case it is a good choice.

Shuvi
New Contributor III

Thank you. So for a large workload, where we need a lot of optimization, we might need Synapse, but for a small/medium workload we might stick with Delta tables?

