
Databricks bundles - good practice for multiple production envs

mderela
New Contributor II

I'm seeking advice regarding Databricks bundles. In my scenario, I have multiple production environments where I aim to execute the same DLT. To simplify, let's assume the DLT reads data from 'eventhub-region-name,' with this being the only differing factor. Therefore, I can parameterize a variable and utilize it as a configuration parameter in the DLT definition.

However, I'm unsure about the best practice for targeting. Both environments are production, and I prefer to keep the code simple. Should I define something like 'prod-1' and 'prod-2' as targets, or is there a better approach?
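For concreteness, this is a minimal sketch of the kind of databricks.yml I have in mind; the bundle name, workspace hosts, notebook path, and region values below are just placeholders:

```yaml
bundle:
  name: eventhub-dlt            # placeholder bundle name

variables:
  eventhub_region:
    description: Event Hub region this deployment reads from
    default: eventhub-region-1

resources:
  pipelines:
    ingest_pipeline:
      name: ingest-pipeline
      configuration:
        eventhub_region: ${var.eventhub_region}    # exposed to the DLT code as a Spark conf
      libraries:
        - notebook:
            path: ./pipelines/ingest_dlt           # placeholder notebook path

targets:
  prod-1:
    mode: production
    workspace:
      host: https://adb-1111111111111111.1.azuredatabricks.net   # placeholder host
    variables:
      eventhub_region: eventhub-region-1
  prod-2:
    mode: production
    workspace:
      host: https://adb-2222222222222222.2.azuredatabricks.net   # placeholder host
    variables:
      eventhub_region: eventhub-region-2
```

The pipeline code would then read the value with spark.conf.get("eventhub_region"), so the DLT definition itself stays identical across both environments.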

1 REPLY

Kaniz_Fatma
Community Manager

Hi @mderela,

When dealing with Databricks bundles in a multi-environment setup, there are some best practices you can follow to ensure smooth execution and maintainable code.

Let’s explore a couple of recommendations:

  1. Parameterization and Configuration:

    • Your approach of parameterizing the differing factor (in this case, the ‘eventhub-region-name’) is a good start. By using a variable, you can easily switch between environments without modifying the core logic of your DLT.
    • Consider defining environment-specific configuration parameters (such as connection strings, credentials, or other settings) in a central location (e.g., a configuration file or a secret store). This way, you can keep the code clean and avoid hardcoding environment-specific details directly in your DLT definition.
  2. Dynamic Cluster Selection:

    • Define the pipeline’s compute per target (for example, a different autoscaling range in each production workspace), so every environment gets appropriately sized clusters without changes to the pipeline code; see the sketch after this list.
  3. Service Principals for Production Deployments:

    • Deploy and run production targets as a service principal rather than a personal user account, so deployments stay reproducible and are not tied to an individual’s permissions; the sketch after this list includes a run_as entry per target.

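To make points 1 to 3 concrete, here is an illustrative (not authoritative) sketch of how the targets section of such a bundle could look; the workspace hosts, service principal application IDs, and worker counts are placeholders:

```yaml
targets:
  prod-1:
    mode: production                 # production-mode validation at deploy time
    workspace:
      host: https://adb-1111111111111111.1.azuredatabricks.net        # placeholder host
    run_as:
      service_principal_name: 00000000-0000-0000-0000-000000000000    # placeholder application ID
    variables:
      eventhub_region: eventhub-region-1      # environment-specific value, no code change needed
    resources:
      pipelines:
        ingest_pipeline:             # overrides the shared pipeline definition for this target
          clusters:
            - label: default
              autoscale:
                min_workers: 2
                max_workers: 8

  prod-2:                            # mirrors prod-1 with its own host, principal, region, and sizing
    mode: production
    workspace:
      host: https://adb-2222222222222222.2.azuredatabricks.net        # placeholder host
    run_as:
      service_principal_name: 11111111-1111-1111-1111-111111111111    # placeholder application ID
    variables:
      eventhub_region: eventhub-region-2
    resources:
      pipelines:
        ingest_pipeline:
          clusters:
            - label: default
              autoscale:
                min_workers: 4
                max_workers: 16
```

Secrets such as the Event Hubs connection string are best kept in a secret scope and looked up at runtime (for example with dbutils.secrets.get), so only non-sensitive values like the region name live in the bundle configuration.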
For more detailed information, you can refer to the Databricks documentation on Databricks Asset Bundle deployment modes. Additionally, consider exploring the best practices for integrating notebooks into production assets.

 
