Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Looking for experiences with DABS CLI Deployment, Terraform and Security

FabianGutierrez
Contributor

Hi Community,

I hope this topic finds you well. Within our Databricks landscape we decided to use DABs (Databricks Asset Bundles); however, we found out (the hard way) that they use Terraform for deployment. This is now a concern for our Security team and Architects, especially regarding the state file, which can contain sensitive information (keys). The Architects have asked us to provide a solution where the .tfstate file is isolated from any user, even Admins.
We are looking for ways to isolate it (no permissions) so that it can only be read with elevated permissions.
Also, the state file does not get deployed to the target workspace, while other files do (deployment.json | metadata.json).
We provided a separate path within our YAML file and still see no trace of the state file.
The Architects also said that, as an alternative to an isolated location for the state file, it would be acceptable to have audit logging showing who accessed the state file and when, but we are not aware of such a feature within Unity Catalog.
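To illustrate what we mean by a separate path, here is a minimal sketch (not our real configuration; every name and path below is a placeholder) assuming the bundle `workspace.state_path` and top-level `permissions` settings:

```yaml
# databricks.yml -- minimal sketch, placeholder names throughout
bundle:
  name: my_bundle

# Restrict who can manage the deployed bundle and its workspace paths.
permissions:
  - level: CAN_MANAGE
    group_name: platform-admins

targets:
  prod:
    workspace:
      host: https://<workspace-url>
      # Assumed setting: relocate the Terraform state away from the
      # default ${workspace.root_path}/state location into a folder
      # that only the group above can access.
      state_path: /Workspace/Shared/.bundle/my_bundle/prod/state
```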

If you recognize this scenario, or have had experience with this subject or something similar, please share it.

Any information is more than welcome.
Thanks in advance.
Regards, Fabian

 

1 REPLY

hari-prasad
Valued Contributor

Hi @FabianGutierrez,

From what I have observed, Databricks Asset Bundles do leverage Terraform under the hood, but they do not leave a terraform.tfstate file in your project: the Terraform artifacts live under the `.databricks` folder, which is gitignored and therefore never synced to the remote repo.

Additionally, you can gitignore the databricks.yml file itself; each user can then create their own databricks.yml with their account details and run `databricks bundle validate --profile <profilename>`, which generates the `.databricks` folder (itself ignored by default). A minimal sketch of such a per-user file is below.
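For example (all names are placeholders; `workspace.profile` is assumed to point at an entry in the user's ~/.databrickscfg):

```yaml
# databricks.yml -- per-user sketch, kept out of version control
bundle:
  name: my_bundle

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://<workspace-url>
      # Assumed to reference a named profile in ~/.databrickscfg
      profile: <profilename>
```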

Finally, for the production environment you can create a service principal (service account) and use it in your CI/CD pipeline to deploy the asset bundle with a databricks.yml that carries the service principal details instead of any individual user's.
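As a sketch of what that could look like, assuming GitHub Actions and OAuth machine-to-machine credentials for the service principal (the secret names, workflow layout, and target name are placeholders, not a prescribed setup):

```yaml
# .github/workflows/deploy-bundle.yml -- CI sketch, placeholder names
name: deploy-bundle
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: databricks/setup-cli@main   # installs the Databricks CLI
      - name: Deploy the bundle as the service principal
        run: databricks bundle deploy --target prod
        env:
          # OAuth machine-to-machine credentials of the service principal,
          # stored as repository secrets rather than in databricks.yml.
          DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
          DATABRICKS_CLIENT_ID: ${{ secrets.DATABRICKS_CLIENT_ID }}
          DATABRICKS_CLIENT_SECRET: ${{ secrets.DATABRICKS_CLIENT_SECRET }}
```

With this, only the service principal's identity touches the production workspace during deployment.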

Regards,
Hari Prasad



