09-13-2022 02:25 AM
The curated zone is pushed to a cloud data warehouse such as Synapse Dedicated SQL Pools, which then acts as a serving layer for BI tools and analysts.
I believe we could instead keep the models in the gold layer and have BI connect to that layer directly, or use serverless infrastructure for ad-hoc querying. Do we still need a dedicated warehouse like Synapse in this setup? A sketch of the "push to Synapse" path I mean is below.
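For context, a minimal sketch of that "push the curated zone to Synapse" step, using the Azure Synapse connector available in Databricks. The table names, JDBC URL, and staging path are placeholders, not values from an actual setup:

```python
# Minimal sketch (assumes a Databricks notebook where `spark` is predefined):
# copy a gold Delta table into a Synapse Dedicated SQL Pool via the Azure
# Synapse connector, which stages data in ADLS before loading it.
gold_df = spark.read.table("gold.sales_summary")  # hypothetical gold table

(gold_df.write
    .format("com.databricks.spark.sqldw")
    .option("url", "jdbc:sqlserver://<server>.sql.azuresynapse.net:1433;database=<pool>;encrypt=true")
    .option("tempDir", "abfss://staging@<storageaccount>.dfs.core.windows.net/synapse-tmp")
    .option("forwardSparkAzureStorageCredentials", "true")
    .option("dbTable", "dbo.sales_summary")
    .mode("overwrite")
    .save())
```

The alternative I'm asking about would skip this copy entirely and point BI tools at the gold Delta tables themselves.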
- Labels:
  - Azure
  - Azure Synapse
  - Delta
  - Use Case
Accepted Solutions
09-13-2022 02:46 AM
Not necessarily. Databricks SQL can also be very fast, so I'd do a performance/cost comparison for one of your 'heavy' workloads.
And do not forget to take the cost of copying the data into account (which you do not have to do with Databricks SQL, since it queries the Delta tables in place).
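As a rough illustration of that comparison, something like the sketch below could time one representative 'heavy' query directly against the gold Delta tables (no copy into Synapse needed); the query and table name are hypothetical, and `spark` is assumed to be the predefined session in a Databricks notebook:

```python
import time

# Run one representative heavy aggregation straight against the gold layer
# and note the runtime, to compare with the same query in Synapse.
heavy_query = """
    SELECT region, product, SUM(amount) AS revenue
    FROM gold.sales_summary
    GROUP BY region, product
"""

start = time.time()
result = spark.sql(heavy_query)
result.collect()  # force execution so the timing is meaningful
print(f"Databricks SQL / Delta runtime: {time.time() - start:.1f}s")
```

Whatever the Synapse side comes out at, remember to add the cost and latency of the copy pipeline to its total.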
09-13-2022 02:37 AM
In the past I would have said yes, because running queries on a Spark cluster is not that fast.
But right now, with Databricks SQL, I don't immediately see the added value.
A use case could be a proprietary login method (SQL Server logins, for example) or more tuning options (table distribution and index types, for example).
It can also be that Synapse is cheaper or faster than Databricks SQL for your use case.
So in general I'd say there is no reason to use Synapse, but it might still be a good choice for your specific use case.
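To make the login difference concrete, here is a hedged sketch of how a client or BI-style connection to Databricks SQL typically authenticates today: with a personal access token (or Azure AD) against a SQL warehouse rather than a SQL Server login. The hostname, HTTP path, and token are placeholders, and it assumes the `databricks-sql-connector` package is installed:

```python
# Token-based connection to a Databricks SQL warehouse (placeholder values).
from databricks import sql

with sql.connect(
    server_hostname="<workspace>.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as conn:
    with conn.cursor() as cursor:
        cursor.execute(
            "SELECT region, SUM(amount) FROM gold.sales_summary GROUP BY region"
        )
        for row in cursor.fetchall():
            print(row)
```

If your organisation specifically needs SQL Server logins or Synapse-style table tuning, that would be the kind of use case where Synapse still has an edge.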
09-13-2022 02:41 AM
Thank you. So for a large workload where we need a lot of optimization we might need Synapse, but for small/medium workloads we could stick with Delta tables?