Passing parameters from a DevOps pipeline/release to a Databricks notebook

SEOCO
New Contributor II

Hi,

This is all a bit new to me.

Does anybody have any idea how to pass a parameter to a Databricks notebook?

I have a DevOps pipeline/release that promotes my Databricks notebooks to the QA and Production environments. The only problem I am facing is that the Data Lake storage locations on DEV, QA, and PROD are different.

Any idea how to pass the right Data Lake storage string/parameter?

Thx

m

3 REPLIES 3

Hubert-Dudek
Esteemed Contributor III

I use separate clusters for dev and production, so this way it is easy: just use cluster environment variables for the Data Lake storage location:

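Hubert's suggestion can be sketched like this. It is a minimal sketch, assuming an environment variable named DATALAKE_STORAGE_PATH (the name is an assumption); you would set a different value for it on each cluster (DEV, QA, PROD) under the cluster's Environment Variables settings:

```python
import os

# Read the Data Lake root from a cluster environment variable.
# DATALAKE_STORAGE_PATH is an assumed name: set it per cluster
# (DEV, QA, PROD) in the cluster's Environment Variables settings.
def datalake_root(default=None):
    path = os.environ.get("DATALAKE_STORAGE_PATH", default)
    if path is None:
        raise RuntimeError("DATALAKE_STORAGE_PATH is not set on this cluster")
    return path
```

The same notebook code then runs unchanged in every environment; only the variable's value differs per cluster.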

SEOCO
New Contributor II

Thank you for your feedback. I moved it to Key Vault and it is working.
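For the pipeline side of this, per-stage variables in an Azure DevOps YAML pipeline are one way to carry the environment-specific value through the release. A rough config sketch, with all stage and variable names assumed (in the notebook itself, a Key Vault-backed secret scope would typically be read with `dbutils.secrets.get`):

```yaml
# Sketch of per-stage variables in an Azure DevOps pipeline.
# Stage names, variable names, and storage URIs are all assumptions.
stages:
- stage: QA
  variables:
    datalakePath: 'abfss://qa@qaaccount.dfs.core.windows.net/'
  jobs: []   # deploy notebooks to the QA workspace here
- stage: Prod
  variables:
    datalakePath: 'abfss://prod@prodaccount.dfs.core.windows.net/'
  jobs: []   # deploy notebooks to the Prod workspace here
```

Each stage then sees its own `datalakePath`, so the deployment steps stay identical across environments.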

Anonymous
Not applicable

@Mario Walle​ - If @Hubert Dudek​'s answer solved the issue, would you be happy to mark his answer as best so that it will be more visible to other members?
