Databricks bundles - good practice for multiprocessing envs
04-29-2024 07:17 PM
I'm seeking advice regarding Databricks bundles. In my scenario, I have multiple production environments where I aim to run the same DLT pipeline. To simplify, let's assume the DLT reads data from 'eventhub-region-name', and this is the only factor that differs between environments. Therefore, I can parameterize it as a variable and use it as a configuration parameter in the DLT definition.
However, I'm unsure about the best practice for defining targets. Both environments are production, and I prefer to keep the code simple. Should I define something like 'prod-1' and 'prod-2' as targets, or is there a better approach?
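For context on the pattern being asked about: Databricks Asset Bundles do support multiple production-mode targets that share one pipeline definition, with per-target variable overrides. Below is a minimal `databricks.yml` sketch of that approach; the bundle name, pipeline name, notebook path, Event Hub values, and workspace hosts are all illustrative assumptions, not taken from the post.

```yaml
# databricks.yml - minimal sketch of two production targets sharing one
# DLT pipeline; all concrete names/hosts below are hypothetical.
bundle:
  name: my-dlt-bundle

variables:
  eventhub_name:
    description: Event Hub the DLT pipeline reads from

resources:
  pipelines:
    my_dlt_pipeline:
      name: my-dlt-pipeline
      # The variable is passed into the pipeline as a configuration
      # parameter, readable from the DLT code via spark.conf.
      configuration:
        eventhub_name: ${var.eventhub_name}
      libraries:
        - notebook:
            path: ./src/dlt_pipeline.py

targets:
  prod-1:
    mode: production
    workspace:
      host: https://region1.cloud.databricks.com  # hypothetical
    variables:
      eventhub_name: eventhub-region-1            # hypothetical
  prod-2:
    mode: production
    workspace:
      host: https://region2.cloud.databricks.com  # hypothetical
    variables:
      eventhub_name: eventhub-region-2            # hypothetical
```

Each environment is then deployed by selecting its target, e.g. `databricks bundle deploy -t prod-1`, so the pipeline code stays identical and only the target's variable values change.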