07-22-2019 07:45 AM
How do you pass arguments and variables to a Databricks Python activity from Azure Data Factory?
07-22-2019 08:31 AM
I do something like this...
In the Databricks notebook:
dbutils.widgets.text("runDateYYYYMMDD", "")        # register the widget with an empty default
runDate = dbutils.widgets.get("runDateYYYYMMDD")   # read the value passed in from ADF
In Data Factory:
Pipeline > Parameters tab > New Parameter:
Name: RunDate, Type: String, Default Value: (blank)
Databricks Notebook activity > Settings tab > New Base Parameter:
Name = runDateYYYYMMDD, Value = @pipeline().parameters.RunDate
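Once the widget value arrives in the notebook, it is just a string, so it is worth validating before use. A minimal sketch, assuming the YYYYMMDD format above; the helper name `parse_run_date` is my own, not part of dbutils:

```python
from datetime import datetime

# In the notebook, raw would come from
# dbutils.widgets.get("runDateYYYYMMDD"); here we just take any string.
def parse_run_date(raw):
    """Parse a YYYYMMDD string into a date; raises ValueError if malformed."""
    return datetime.strptime(raw, "%Y%m%d").date()
```

Failing fast on a malformed date here is cheaper than discovering a bad partition path halfway through the job.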
07-22-2019 11:01 PM
Hi Timkracht,
Thanks for your reply.
Is there any way to read those parameters directly, without using widgets? In our case, Azure Data Factory triggers a Python activity whose program file is stored in DBFS, and I need to access those variables inside that Python program.
We also need to know which Python files are being executed by which pipeline and activity of Azure Data Factory.
Thanks in advance.
07-23-2019 05:44 AM
I'm afraid I do not have experience with that, just passing parameters through widgets in notebooks.
05-27-2021 06:43 AM
Try importing argv from sys. If you have the parameter added correctly in Data Factory, you can read it in your Python script as argv[1] (index 0 is the script path).
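A minimal sketch of the script side, assuming ADF appends the Python activity's parameter list to the command line after the script path; the parameter meanings (pipeline name, run date) are illustrative assumptions, not anything ADF sets automatically:

```python
import sys

def parse_args(argv):
    """Return the ADF-supplied parameters, skipping the script path at argv[0]."""
    return list(argv[1:])

# Usage inside the DBFS-hosted script:
# pipeline_name, run_date = parse_args(sys.argv)
```

To know which pipeline/activity launched the file, you have to pass that in yourself, e.g. as parameters built from ADF expressions such as @pipeline().Pipeline.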