how to pass arguments and variables to databricks python activity from azure data factory
07-22-2019 07:45 AM

07-22-2019 08:31 AM
I do something like this...

In the Databricks notebook:

```python
dbutils.widgets.text("runDateYYYYMMDD", "")
runDate = dbutils.widgets.get("runDateYYYYMMDD")
```

In Data Factory:

- Pipeline > Parameters tab > New Parameter: Name: RunDate, Type: String, Default Value:
- Databricks Notebook activity > Settings tab > New Base Parameter: Name: runDateYYYYMMDD, Value: @pipeline().parameters.RunDate
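For reference, the corresponding fragment of the pipeline's activity definition might look roughly like this (a sketch; the notebook path is a placeholder, and only the parameter names from the example above are assumed):

```json
{
  "name": "RunNotebook",
  "type": "DatabricksNotebook",
  "typeProperties": {
    "notebookPath": "/Shared/my-notebook",
    "baseParameters": {
      "runDateYYYYMMDD": "@pipeline().parameters.RunDate"
    }
  }
}
```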
07-22-2019 11:01 PM
Hi Timkracht,

Thanks for your reply.

Is there any way to read those parameters directly, without using widgets? We trigger a Python activity from Azure Data Factory, with the Python program file stored in DBFS, and we need to access those parameters inside that program. Specifically, we need to know which Python files are being executed by which pipeline and activity of Azure Data Factory.

Thanks in advance.

07-23-2019 05:44 AM
I'm afraid I do not have experience with that, just passing parameters through widgets in notebooks.
05-27-2021 06:43 AM
Try importing argv from sys. If you have the parameter configured correctly in Data Factory, you can read it in your Python script as argv[1] (index 0 is the script path).
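To make that concrete, a minimal sketch of a DBFS-hosted script reading its positional arguments. The parameter names (run date, pipeline name) are hypothetical: Data Factory only passes whatever values you list on the activity, so to know which pipeline triggered the script you would pass something like @pipeline().Pipeline as one of those values yourself.

```python
import sys


def parse_run_args(argv):
    """Parse positional arguments passed by the ADF Python activity.

    argv[0] is the script path; the values from the activity's
    parameter list follow in order. The meanings assigned here
    (run date, then pipeline name) are just this example's convention.
    """
    if len(argv) < 3:
        raise ValueError("expected a run date and a pipeline name argument")
    run_date = argv[1]
    pipeline_name = argv[2]
    return run_date, pipeline_name


if __name__ == "__main__":
    # In the real activity these come from sys.argv at runtime;
    # the literal list below just demonstrates the call shape.
    run_date, pipeline = parse_run_args(["script.py", "20190722", "my_pipeline"])
    print(run_date, pipeline)
```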

