The error you’re seeing (SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1147)')) while triggering the Azure DevOps pipeline from Databricks points to a failed SSL/TLS handshake, not to the firewall or certificate itself. It is often caused by the way the authorization header is constructed and encoded, an incorrect endpoint URL, or network issues between Databricks (hosted in Azure) and the Azure DevOps API.
Common Causes & Resolutions
1. Authorization Header Format
Your code’s authorization header is using hex encoding, which is not the standard for HTTP Basic Authorization. The correct encoding should be Base64.
Replace this:
"Authorization": f"Basic {(':' + pat).encode('utf-8').hex()}",
With this (Base64 encoding):
import base64
token = base64.b64encode(f":{pat}".encode()).decode()
headers = {
    "Authorization": f"Basic {token}",
    "Content-Type": "application/json"
}
The HTTP Basic Auth header must always be the Base64 encoding of username:password; here the username is empty and the password is your PAT.
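If you want to sanity-check the encoding, you can decode the token and confirm it round-trips to :&lt;your PAT&gt;. A minimal sketch, with a placeholder PAT:

import base64

pat = "your_PAT_here"  # placeholder, not a real token
token = base64.b64encode(f":{pat}".encode()).decode()

# Decoding must give back ":your_PAT_here"; this is the value after "Basic "
assert base64.b64decode(token).decode() == f":{pat}"
print(token)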
2. Using the Auth Argument
Alternatively, you can use the built-in HTTPBasicAuth class from the requests library, which handles proper header construction automatically. Your commented-out code is already correct:
response = requests.post(url, headers=headers, data=json.dumps(data), auth=HTTPBasicAuth("", pat))
If you use auth=HTTPBasicAuth("", pat), you don’t need to manually construct the Authorization header — let requests handle it.
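To see exactly what requests will send, you can prepare the request and inspect the generated header. A minimal sketch; the URL and PAT are placeholders:

import requests
from requests.auth import HTTPBasicAuth

url = "https://dev.azure.com/azdevops/DevOps/_apis/pipelines/12/runs?api-version=7.1-preview.1"
pat = "your_PAT_here"

req = requests.Request("POST", url, auth=HTTPBasicAuth("", pat))
prepared = req.prepare()
print(prepared.headers["Authorization"])  # "Basic OnlvdXJfUEFUX2hlcmU=" for this placeholder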
3. Pipeline ID Format
The pipeline_id must be the numeric pipeline resource ID (for example, pipeline_id = 12), not a run/build number or string such as 20250224.1. Use the integer ID of the pipeline itself, not the run number that appears in the pipeline URL after a run starts; you can look the ID up via the API, as sketched below.
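If you are unsure of the ID, the Pipelines list endpoint returns every pipeline in the project with its numeric ID. A minimal sketch, with placeholder organization, project, and PAT values:

import requests
from requests.auth import HTTPBasicAuth

organization = "azdevops"   # placeholder
project = "DevOps"          # placeholder
pat = "your_PAT_here"       # placeholder

# List pipelines in the project and print their numeric IDs and names
url = f"https://dev.azure.com/{organization}/{project}/_apis/pipelines?api-version=7.1-preview.1"
response = requests.get(url, auth=HTTPBasicAuth("", pat))
response.raise_for_status()

for pipeline in response.json()["value"]:
    print(pipeline["id"], pipeline["name"])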
4. Endpoint URL Structure
Your URL structure may be incorrect. It should be:

https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipeline_id}/runs?api-version=7.1-preview.1
Make sure organization and project do not contain special characters or spaces, and that pipeline_id is the actual pipeline resource ID.
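If your project name does contain spaces or special characters, percent-encoding the path segments keeps the URL well-formed. A minimal sketch using the standard library; the project name is hypothetical:

from urllib.parse import quote

organization = "azdevops"      # placeholder
project = "My Project Name"    # hypothetical name with spaces
pipeline_id = 12               # placeholder

url = (
    f"https://dev.azure.com/{quote(organization)}/{quote(project)}"
    f"/_apis/pipelines/{pipeline_id}/runs?api-version=7.1-preview.1"
)
print(url)  # spaces become %20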
5. TLS/SSL on Databricks
If you’re running this from Databricks notebooks, make sure the cluster supports TLS 1.2 and has up-to-date root certificates. Sometimes, upgrading the Databricks runtime resolves SSL issues.
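You can check both from a notebook cell. A minimal diagnostic sketch:

import ssl
import requests

print(ssl.OPENSSL_VERSION)  # OpenSSL build the Python runtime is linked against
print(ssl.HAS_TLSv1_2)      # True if TLS 1.2 is available

# Basic connectivity probe; any HTTP response at all means the handshake succeeded
response = requests.get("https://dev.azure.com")
print(response.status_code)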
Sample Corrected Code
import requests
import json
from requests.auth import HTTPBasicAuth
organization = "azdevops"
project = "DevOps"
pipeline_id = 12 # Use correct pipeline resource ID, NOT build/run number!
pat = "your_PAT_here"
url = f"https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipeline_id}/runs?api-version=7.1-preview.1"
headers = {"Content-Type": "application/json"}
data = {
    "resources": {
        "repositories": {
            "self": {"refName": "refs/heads/main"}
        }
    }
}

try:
    response = requests.post(url, headers=headers, data=json.dumps(data), auth=HTTPBasicAuth("", pat))
    # No raise_for_status() here: it would raise on non-2xx responses and make
    # the else branch below unreachable, hiding the failure details.
    if response.status_code == 200:
        pipeline_run_details = response.json()
        print("Pipeline triggered successfully!")
        print(f"Pipeline run URL: {pipeline_run_details['_links']['web']['href']}")
    else:
        print(f"Failed to trigger pipeline. Status code: {response.status_code}")
        print(f"Response body: {response.text}")
except requests.exceptions.RequestException as e:
    print(f"An error occurred: {e}")
Key Checklist
- PAT is correct and has the required permissions for pipeline execution.
- Pipeline ID is the numeric resource ID, not a run/build number.
- Authorization header uses Base64 or HTTPBasicAuth, not hex.
- URL matches the Azure DevOps pipeline trigger format.
- Databricks cluster runtime is up to date and supports TLS 1.2.
If you still see the SSL error after these corrections, check the network policies in Azure (NSG, private endpoints), and consider running the code outside Databricks to rule out platform-specific issues.
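To isolate the handshake itself, you can probe the endpoint with a raw socket from the same cluster and see which TLS version is negotiated. A minimal sketch:

import socket
import ssl

# Open a TLS connection to dev.azure.com directly, bypassing requests
ctx = ssl.create_default_context()
with socket.create_connection(("dev.azure.com", 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname="dev.azure.com") as tls:
        print(tls.version())                 # e.g. "TLSv1.2" or "TLSv1.3"
        print(tls.getpeercert()["subject"])  # present only if the chain validated

If this probe fails with the same SSLEOFError, the problem is at the network layer (proxy, NSG, or private endpoint) rather than in your request code.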