02-08-2025 03:53 AM
Hello community.
I need help defining the most suitable approach for Unity Catalog. I have the following storage architecture in Azure Data Lake Storage.
I need to read, write, and work with the data from Databricks. The Azure storage is updated with new data on a daily basis.
What would be the best approach considering catalogs, schemas, external volumes, tables and so on?
Thanks in advance.
02-08-2025 02:25 PM
Hello @mbravonxp,
Create a separate catalog for each client to logically isolate their data. This helps in managing permissions and organizing data efficiently.
Within each catalog, create schemas for each environment (dev, pre, pro). This will help in managing the data lifecycle and access control for the different stages of development.
Use Databricks jobs or workflows to automate the data processing and the updating of tables in the bronze, silver, and gold layers (see the sketches below).
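To make this concrete, here is a minimal sketch of what that layout could look like when run from a Databricks notebook (where a SparkSession named spark is already available). The client names, the group name, and the number of clients are hypothetical placeholders, not your actual objects:

# Sketch only: one catalog per client, one schema per environment inside it.
# "client_a", "client_b", "client_c" and the group name are hypothetical examples.
clients = ["client_a", "client_b", "client_c"]
environments = ["dev", "pre", "pro"]

for client in clients:
    # A catalog per client isolates that client's data and permissions.
    spark.sql(f"CREATE CATALOG IF NOT EXISTS {client}")
    for env in environments:
        # A schema per environment supports lifecycle and access control per stage.
        spark.sql(f"CREATE SCHEMA IF NOT EXISTS {client}.{env}")
    # Example grant: allow a client-specific group to browse the catalog.
    spark.sql(f"GRANT USE CATALOG ON CATALOG {client} TO `{client}_team`")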
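For the daily updates, a scheduled job could use Auto Loader to pick up only the files that have newly landed in ADLS and append them to a bronze table, with silver and gold tables refreshed in subsequent tasks. This is only a sketch under assumed names; the storage account, container, file format, and table name below are placeholders:

# Sketch only: incremental ingestion of new ADLS files into a bronze table.
# Replace the abfss paths, file format, and table name with your own.
source_path = "abfss://raw@<storage_account>.dfs.core.windows.net/client_a/"
checkpoint_path = "abfss://raw@<storage_account>.dfs.core.windows.net/_checkpoints/client_a_bronze"

(
    spark.readStream.format("cloudFiles")              # Auto Loader
    .option("cloudFiles.format", "parquet")            # format of the daily files (assumed)
    .option("cloudFiles.schemaLocation", checkpoint_path)
    .load(source_path)
    .writeStream
    .option("checkpointLocation", checkpoint_path)
    .trigger(availableNow=True)                        # process new files, then stop (fits a daily job)
    .toTable("client_a.dev.bronze_events")             # hypothetical bronze table
)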
02-10-2025 11:41 PM - edited 02-10-2025 11:42 PM
Hi @mbravonxp ,
In this case, the best approach is to have a single catalog per client per environment (so 9 catalogs in total).
In every catalog you will create bronze, silver, and gold schemas.
Additionally, every catalog will have its own separate storage, and you may also consider having a separate workspace per client per environment.
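To illustrate that layout, here is a minimal sketch, assuming a storage credential and external locations covering the abfss paths have already been registered in Unity Catalog. All client names, storage accounts, and containers below are hypothetical placeholders:

# Sketch only: one catalog per client per environment (9 in total), each with its
# own managed storage and bronze/silver/gold schemas. Names and paths are placeholders.
clients = ["client_a", "client_b", "client_c"]
environments = ["dev", "pre", "pro"]
layers = ["bronze", "silver", "gold"]

for client in clients:
    for env in environments:
        catalog = f"{client}_{env}"
        # Each catalog points at its own container, so managed tables stay isolated.
        spark.sql(f"""
            CREATE CATALOG IF NOT EXISTS {catalog}
            MANAGED LOCATION 'abfss://{env}@{client}storage.dfs.core.windows.net/managed/'
        """)
        for layer in layers:
            spark.sql(f"CREATE SCHEMA IF NOT EXISTS {catalog}.{layer}")
        # Optionally expose the raw landing area as an external volume in bronze.
        # Its path must not overlap with the managed location above.
        spark.sql(f"""
            CREATE EXTERNAL VOLUME IF NOT EXISTS {catalog}.bronze.raw_files
            LOCATION 'abfss://raw@{client}storage.dfs.core.windows.net/{env}/'
        """)

The external volume is optional; it simply gives you governed, path-based access to the raw files that land in ADLS every day, which you can then ingest into the bronze tables.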
Check my answer to a similar topic on the forum:
https://community.databricks.com/t5/community-platform-discussions/unity-catalog-implementation/td-p...
02-12-2025 12:25 AM
Hi both,
Thanks very much for the useful replies. I will definitely go with your suggestions.
Best.