05-15-2025 05:19 AM
Hello All,
I have a use case where I want to trigger an Azure Data Factory pipeline through an API. Right now I am calling the API in Databricks and using a Service Principal (token-based) to connect to ADF from Databricks. The ADF pipeline has some parameters which I want to pass via the API.
The issue is that when I run the notebook, the ADF pipeline gets triggered, but the parameters get passed as empty values or ''. Below is the code for the POST method that triggers the pipeline (some details have been masked for security reasons):
import json
import requests
from azure.identity import ClientSecretCredential

tenant_id = dbutils.secrets.get(scope="adb-dev-secret-scope", key="spn-databricks-dev-eastus2-001-tenantid")
client_id = dbutils.secrets.get(scope="adb-dev-secret-scope", key="spn-databricks-dev-eastus2-001-clientid")
client_secret = dbutils.secrets.get(scope="adb-dev-secret-scope", key="spn-databricks-dev-eastus2-001-secret")
subscription_id = "****"
resource_group = "****"
factory_name = "****"
pipeline_name = "Test Adf from databricks"
api_version = "2018-06-01"

# Optional: parameters to pass to the pipeline
parameters = {
    'curr_working_user': 'xyz'
}

# === Get token using client secret auth ===
credential = ClientSecretCredential(tenant_id, client_id, client_secret)
token = credential.get_token("https://management.azure.com/.default").token

headers = {
    "Authorization": f"Bearer {token}",
    "Content-Type": "application/json"
}

# Create Run endpoint for the pipeline
url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.DataFactory"
    f"/factories/{factory_name}/pipelines/{pipeline_name}/createRun"
    f"?api-version={api_version}"
)

body = {
    'parameters': f'{parameters}'
}

print(json.dumps(body, indent=1))
response = requests.post(url, headers=headers, json=body)
print(response.status_code)
print(response.json())
I want the parameter "curr_working_user" to be passed as 'xyz' to the ADF pipeline. The parameter name in ADF is the same: curr_working_user.
Your help is greatly appreciated!!
Thanks,
Andolina
#azuredatafactory #API #parameters
05-15-2025 07:02 AM
I see the issue in your code. The problem is with how you're constructing the body parameter object.
The Problem
You're formatting the parameters dictionary into a string when creating the body, which is causing the parameters to be sent incorrectly:

body = {
    'parameters': f'{parameters}'  # This converts parameters to a string representation
}

When you use f'{parameters}', it converts your dictionary to a string like "{'curr_working_user': 'xyz'}", which ADF can't parse correctly as a JSON object.
Solution:
Simply pass the parameters dictionary directly without string conversion:
body = {
    'parameters': parameters  # Pass the dictionary directly
}
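For illustration, here is what the two versions serialize to:

import json

parameters = {'curr_working_user': 'xyz'}

# f-string: the dict becomes its Python repr, sent as one string value
print(json.dumps({'parameters': f'{parameters}'}))
# {"parameters": "{'curr_working_user': 'xyz'}"}

# direct: the dict serializes as a nested JSON object
print(json.dumps({'parameters': parameters}))
# {"parameters": {"curr_working_user": "xyz"}}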
Here's the corrected code:
import json
import requests
from azure.identity import ClientSecretCredential

tenant_id = dbutils.secrets.get(scope="adb-dev-secret-scope", key="spn-databricks-dev-eastus2-001-tenantid")
client_id = dbutils.secrets.get(scope="adb-dev-secret-scope", key="spn-databricks-dev-eastus2-001-clientid")
client_secret = dbutils.secrets.get(scope="adb-dev-secret-scope", key="spn-databricks-dev-eastus2-001-secret")
subscription_id = "****"
resource_group = "****"
factory_name = "****"
pipeline_name = "Test Adf from databricks"
api_version = "2018-06-01"

# Parameters to pass to the pipeline
parameters = {
    'curr_working_user': 'xyz'
}

# Get token using client secret auth
credential = ClientSecretCredential(tenant_id, client_id, client_secret)
token = credential.get_token("https://management.azure.com/.default").token

headers = {
    "Authorization": f"Bearer {token}",
    "Content-Type": "application/json"
}

# Create Run endpoint for the pipeline
url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.DataFactory"
    f"/factories/{factory_name}/pipelines/{pipeline_name}/createRun"
    f"?api-version={api_version}"
)

# Corrected body construction - pass the dictionary directly
body = {
    'parameters': parameters
}

print(json.dumps(body, indent=1))
response = requests.post(url, headers=headers, json=body)
print(response.status_code)
print(response.json())
This will ensure that the parameters are properly serialized as a JSON object within the request body, and ADF will receive them in the correct format.
If you want to verify the request is formatted correctly, you can add these debug lines:
import json
print("Request URL:", url)
print("Request body:", json.dumps(body, indent=2))
The properly formatted JSON should look like:
{
    "parameters": {
        "curr_working_user": "xyz"
    }
}
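Once the run is triggered, you can also check what ADF actually recorded for it. Here's a minimal sketch using the Get Pipeline Run endpoint, reusing the variables from the snippet above:

# The Create Run response contains the new run's ID
run_id = response.json()["runId"]

# Look up the run to see the parameter values ADF recorded for it
run_url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.DataFactory"
    f"/factories/{factory_name}/pipelineruns/{run_id}"
    f"?api-version={api_version}"
)
run = requests.get(run_url, headers=headers).json()
print(run.get("status"), run.get("parameters"))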
05-15-2025 01:37 PM
Hi LR,
Thank you for replying. The suggested way was the first thing I did, and when it didn't work, I started trying different approaches. I have tried passing the parameters both directly and indirectly and am still facing the same issue. Right now, I have it exactly as you have mentioned.
I have added a Set Variable activity in ADF to check the parameter value, and it is still coming through as null.
For the Set Variable value I am passing: @pipeline().parameters.curr_working_user
Kindly let me know if I am missing something.
Thanks,
Andolina
05-15-2025 09:46 PM
Based on your images, I can now see the complete issue more clearly. Your ADF pipeline is receiving a null value for the parameter despite the request showing the parameter is being sent correctly.
Looking at Image 1, your code is correctly formatting the request body:
body = {
    "parameters": {
        "curr_working_user": "xyz"
    }
}
And the API call returns a 200 status code, confirming the pipeline was triggered successfully.
However, in Image 2, we see that the pipeline variable "check_param" is receiving a null value, even though you're trying to assign it the pipeline parameter value.
Here are the most likely causes and solutions:
Issue 1: Parameter name mismatch
There might be a case sensitivity or exact naming issue between your request and the pipeline parameter definition.
Solution:
1. Double-check the exact parameter name in your ADF pipeline definition
2. Ensure the parameter name in your code matches exactly (including case)
Issue 2: Pipeline parameter definition issue
The parameter might not be properly defined or expected in the pipeline.
Solution:
1. Open your pipeline in ADF
2. Go to Parameters tab
3. Verify that "curr_working_user" is defined as a parameter
4. Check its default value (if any)
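You can also check this programmatically. Here's a minimal sketch that fetches the pipeline definition via the Pipelines - Get endpoint, reusing the variables and headers from the earlier snippet:

# Fetch the pipeline definition and print its declared parameters
pipeline_url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.DataFactory"
    f"/factories/{factory_name}/pipelines/{pipeline_name}"
    f"?api-version={api_version}"
)
definition = requests.get(pipeline_url, headers=headers).json()
# Declared parameters live under properties.parameters,
# e.g. {'curr_working_user': {'type': 'String'}}
print(definition["properties"].get("parameters"))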
Issue 3: Pipeline reference syntax
The variable assignment in your pipeline might have incorrect syntax.
Solution:
Verify your pipeline's Set Variable activity is using the correct expression:
@pipeline().parameters.curr_working_user
Issue 4: API endpoint issue
The management API endpoint might not be passing parameters correctly.
Solution: Try using the alternative ADF REST API endpoint format:
body = {
    "parameters": {
        "curr_working_user": "xyz"
    },
    "isRecovery": False,
    "startActivityName": ""  # Optional - for resuming at a specific activity
}
Issue 5: Check if parameter values are visible
In your monitoring view, the "Value" showing null might not represent the actual runtime value.
Solution: Add a logging activity in your pipeline to output the parameter value to a more visible location:
1. Add an Azure Function or Web activity that logs the parameter
2. Add a Copy activity with the parameter in the source or sink name
3. Use Azure Monitor Logs to check the parameter value
Most Important Check:
Verify that your ADF pipeline has a parameter with exactly the name "curr_working_user" under the Parameters tab in the pipeline definition.
If the parameter does exist with the exact same name and is still showing as null, try running the pipeline directly from the ADF UI with test parameters to confirm it accepts parameters correctly before trying the API approach again.
a month ago
Hi LR,
Thank you for the details. I have checked all your points thoroughly.
Issues #1, 2, 3, and 5 are fine. I have the parameter defined in the pipeline with exactly the same name as the Databricks param, and it is being referenced correctly in the variable. I have validated the pipeline and the activity. I am trying to capture the value of the parameter in the Set Variable activity and am still getting null there for every run.
Regarding Issue #4, I have tried passing the body as you mentioned. I passed the below, as I wanted to trigger the pipeline from the copy activity only, to test:
Thursday
Hello All,
Thank you for all your suggestions on this thread. We have raised the issue as a product bug with Microsoft. They did not tell us why parameters do not work with the HTTP request; however, they did demo to us that parameters work with a Python script using the SDK.
https://learn.microsoft.com/en-us/rest/api/datafactory/pipelines/create-run?view=rest-datafactory-20...
This resolved our problem. I have tested passing multiple parameters from Databricks to ADF, and the parameters were taken correctly.
The document doesn't mention how to write the parameters; you have to pass them in the create_run call yourself. E.g.:
##******************code************************************
from azure.identity import ClientSecretCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

print("Running ADF Pipeline....")

tenant_id = "***"
client_id = "***"
client_secret = "***"

# current_user, servername, shared_folder and folder are set earlier in the notebook
parameters = {
    'curr_working_user': f'{current_user}',
    'ServerName': f'{servername}',
    'SharedFolder': f'{shared_folder}',
    'Folder': f'{folder}'
}

def main():
    client = DataFactoryManagementClient(
        credential=ClientSecretCredential(tenant_id, client_id, client_secret),
        subscription_id="***",
    )
    response = client.pipelines.create_run(
        resource_group_name="***",
        factory_name="***",
        pipeline_name="Test Adf from databricks",
        parameters=parameters or {}
    )
    print(response)

if __name__ == "__main__":
    main()
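As a follow-up check, the same client can confirm what the run actually received. A minimal sketch (to go inside main(), after create_run; names masked as above):

# Look up the run that create_run started and print the parameters
# ADF recorded for it
run = client.pipeline_runs.get(
    resource_group_name="***",
    factory_name="***",
    run_id=response.run_id,
)
print(run.status, run.parameters)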