05-20-2022 10:48 AM
Hi,
I would like to deploy Databricks workspaces to build a delta lakehouse that serves both scheduled jobs/processing and ad-hoc/analytical querying workloads. Our Databricks users comprise both data engineers and data analysts.
In addition to optimising costs, I would like to take advantage of the Premium tier's role-based access control and credential passthrough, primarily to ensure that our data analysts' access adheres to the principle of least privilege, i.e. they are not admins. I don't want the analysts tinkering with workspace, table, and cluster objects and configurations.
On this basis, rather than a single Premium tier workspace, is it a viable approach to set up two Databricks workspaces: a Standard workspace to run scheduled jobs/workflows, and a Premium, more secure, workspace for the analysts to run their ad-hoc queries?
Pros
- Scheduled jobs/workflows run exclusively on the Standard SKU, which is cheaper than the Premium SKU.
- Separating workloads delineates billing, useful if we want to distinguish between data engineering and data analytical spend.
Cons
- Slightly more operational and admin overhead in setting up and managing two workspaces as opposed to a single workspace.
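
On the least-privilege point, one way to stop analysts from changing cluster configuration in the Premium workspace is a cluster policy. A minimal sketch is below; the Spark version, node type, and limits are placeholder assumptions, not recommendations:

```json
{
  "spark_version": { "type": "fixed", "value": "10.4.x-scala2.12" },
  "node_type_id": { "type": "allowlist", "values": ["Standard_DS3_v2"] },
  "autotermination_minutes": { "type": "range", "maxValue": 60, "defaultValue": 30 },
  "num_workers": { "type": "range", "maxValue": 4 }
}
```

Analysts granted only this policy (and no unrestricted cluster-create permission) can spin up clusters, but only within these bounds.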
Thanks
Tim
Accepted Solutions
05-21-2022 02:55 AM
@Timothy Lin , yes, exactly what you wrote is the correct approach; having two workspaces is the way to go. The only additional con I see is that you will need to configure a common Hive metastore for both, so that the same tables are available in both workspaces. You can also ask Databricks support for help integrating the two workspaces.
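
To share a metastore across the two workspaces, one approach is to point clusters in both workspaces at the same external Hive metastore via Spark configuration. A sketch of the cluster Spark config is below; the JDBC host, database name, credentials, and metastore version are placeholders for your own values, and in practice the password should come from a secret rather than plain text:

```ini
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:mysql://<metastore-host>:3306/<metastore-db>
spark.hadoop.javax.jdo.option.ConnectionUserName <metastore-user>
spark.hadoop.javax.jdo.option.ConnectionPassword <metastore-password>
spark.hadoop.javax.jdo.option.ConnectionDriverName org.mariadb.jdbc.Driver
spark.sql.hive.metastore.version 2.3.9
spark.sql.hive.metastore.jars builtin
```

With the same settings applied in both workspaces, tables registered by the engineering jobs become visible to the analysts' clusters.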
05-25-2022 12:54 AM
Hi all, thank you for the informative answers!

