Passing parameters from DevOps Pipeline/release to DataBricks Notebook
11-03-2021 04:01 AM
Hi,
This is all a bit new to me.
Does anybody have any idea how to pass a parameter to a Databricks notebook?
I have a DevOps pipeline/release that moves my Databricks notebooks to the QA and Production environments. The only problem I am facing is that the Data Lake storage locations on DEV, QA and PROD are different.
Any idea how to pass the right Data Lake storage string/parameter?
Thx
m
- Labels:
  - Databricks notebook
  - Devops
  - Parameters
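One common pattern for this (a sketch, not something from the thread): have the release pipeline pass the environment-specific path as a notebook parameter, and read it in the notebook with `dbutils.widgets`. The widget name `storage_path` and the `abfss://` path below are hypothetical placeholders.

```python
# Sketch: receiving a per-environment storage path as a notebook parameter.
# The DevOps release would supply "storage_path" when it triggers the
# notebook (e.g. as a base parameter of a Jobs API run).
try:
    dbutils.widgets.text("storage_path", "abfss://dev@devlake.dfs.core.windows.net/")
    storage_path = dbutils.widgets.get("storage_path")
except NameError:
    # dbutils only exists inside a Databricks notebook; fall back to the
    # (hypothetical) DEV path when running elsewhere, e.g. in local tests.
    storage_path = "abfss://dev@devlake.dfs.core.windows.net/"

print(storage_path)
```

Because the widget has a default, the notebook still runs interactively in DEV with no parameter supplied, while QA and PROD releases override it.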
11-03-2021 05:18 AM
I use separate clusters for dev and production, so this is easy: just set the Data Lake storage location as a cluster environment variable:
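As a sketch of this approach: define the variable under the cluster's Advanced Options, Environment Variables (the variable name `DATALAKE_PATH` and the paths are made-up examples), then read it in the notebook:

```python
import os

# Each cluster (DEV/QA/PROD) sets DATALAKE_PATH in its own configuration,
# so the same notebook picks up the right location wherever it runs.
# The variable name and fallback path here are hypothetical.
storage_path = os.environ.get(
    "DATALAKE_PATH",
    "abfss://dev@devlake.dfs.core.windows.net/",  # fallback for local testing
)
print(storage_path)
```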
11-08-2021 04:47 AM
Thank you for your feedback. I moved it to Key Vault and it is working.
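For readers landing here, a minimal sketch of the Key Vault route: store the storage path per environment in an Azure Key Vault-backed secret scope and read it with `dbutils.secrets.get`. The scope name `env-scope` and secret name `datalake-path` are hypothetical.

```python
def get_storage_path():
    """Read the environment's Data Lake path from a secret scope.

    The scope/secret names are placeholders; each workspace (DEV/QA/PROD)
    would back its scope with that environment's Key Vault.
    """
    try:
        return dbutils.secrets.get(scope="env-scope", key="datalake-path")
    except NameError:
        # dbutils is only defined inside Databricks; use a placeholder
        # (hypothetical) path when running locally.
        return "abfss://dev@devlake.dfs.core.windows.net/"
```

This keeps environment-specific values out of the notebooks entirely, so the same code promotes unchanged from DEV through PROD.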

11-08-2021 08:35 AM
@Mario Walle - If @Hubert Dudek's answer solved the issue, would you be happy to mark his answer as best so that it will be more visible to other members?

