@Ruben Sanchez :
Yes, you can pass custom parameters to a Delta Live Tables pipeline when triggering it from Azure Data Factory via the REST API. One way to achieve this is to add the custom parameters to the body of the API call as a JSON object, then retrieve them dynamically inside the notebook attached to the pipeline.
Here is an example of how you can pass custom parameters to a Delta Live Tables pipeline using the REST API:
1) In your ADF pipeline, add a Web activity that calls the REST API endpoint which starts an update of the Delta Live Tables pipeline. In the body of the API call, include a JSON object with your custom parameters, like so:
{
"fullRefresh": true,
"customParameter1": "value1",
"customParameter2": "value2"
}
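For reference, here is a minimal Python sketch of how that request body could be assembled before sending it (from ADF, or from a script using an HTTP client). The helper name `build_update_body` is illustrative, not part of any API, and the keys simply mirror the JSON object above:

```python
import json

def build_update_body(custom_params, full_refresh=True):
    """Merge custom parameters into the request body for the
    pipeline-update API call (keys mirror the JSON example above)."""
    body = {"fullRefresh": full_refresh}
    body.update(custom_params)
    return body

body = build_update_body({
    "customParameter1": "value1",
    "customParameter2": "value2",
})

# This serialized string is what the ADF Web activity (or any HTTP
# client) would send as the request body.
payload = json.dumps(body)
```

The merge order matters: starting from the reserved fields and then applying the custom parameters makes any key collision visible, which ties into the note on reserved keywords below.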
2) In your Delta Live Tables pipeline notebook, retrieve the custom parameters using the dbutils.widgets.get method. This method returns the value of a widget parameter passed to the notebook, which in this case is the JSON object of custom parameters.
import json

# Retrieve the custom parameters JSON string from the notebook widget
custom_params_json = dbutils.widgets.get("custom_params")
custom_params = json.loads(custom_params_json)

# Look up individual parameters (dict.get returns None if a key is absent)
custom_parameter1 = custom_params.get("customParameter1")
custom_parameter2 = custom_params.get("customParameter2")

# Use the custom parameters in your pipeline logic
...
In this example, we first retrieve the custom parameters object from the notebook widget using the dbutils.widgets.get method, then parse the JSON string into a Python dictionary with json.loads. Finally, we read the individual parameter values with the dictionary's get method and use them in our pipeline logic.
Note that the custom parameter keys should be unique and must not conflict with any reserved keywords used by the Delta Live Tables pipeline, such as "fullRefresh" in the example above.
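One way to guard against such collisions is to validate the custom parameter names before sending the request. The reserved-key set below is an illustrative assumption (only "fullRefresh" comes from the example above; extend it with whatever keys your pipeline treats as reserved):

```python
# Illustrative set of reserved keys -- extend as needed for your pipeline.
RESERVED_KEYS = {"fullRefresh"}

def validate_custom_params(params):
    """Raise ValueError if any custom parameter name collides with a
    reserved key; otherwise return the params unchanged."""
    clashes = RESERVED_KEYS & set(params)
    if clashes:
        raise ValueError(
            f"Custom parameter names collide with reserved keys: {sorted(clashes)}"
        )
    return params

# Passes: no reserved names are used
validate_custom_params({"customParameter1": "value1"})
```

Running this check in the ADF-facing code (rather than in the notebook) surfaces the mistake before the pipeline is ever triggered.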
I hope this helps! Let me know if you have any further questions.