Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Getting SSLError(SSLEOFError) while triggering an Azure DevOps pipeline from Databricks

sparmar
New Contributor

While triggering an Azure DevOps pipeline from Databricks, I get the error below:

An error occurred: HTTPSConnectionPool(host='dev.azure.com', port=443): Max retries exceeded with url: /XXX-devops/XXXDevOps/_apis/pipelines/20250224.1/runs?api-version=7.1-preview.1 (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1147)'))).

I have confirmed that there is no firewall issue and no certificate issue. Why am I getting this error with the code below?

import requests
import json
from requests.auth import HTTPBasicAuth

# Azure DevOps organization and project details
organization = "azdevops"
project = "DevOps"

# Pipeline ID
pipeline_id = "20250224.1"

# Personal Access Token (PAT) - Ensure this is securely managed
pat = "XXX5cfJQRSanltGdfWAma6iPSk83XsdfsdfsadfsdfeAsgbUZz0pPa21JIDk5R6tLATeXXX"

# REST API endpoint for triggering a pipeline
url = f"https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipeline_id}/runs?api-version=7.1-preview.1"

# Headers for authentication
headers = {
    "Authorization": f"Basic {(':' + pat).encode('utf-8').hex()}",
    "Content-Type": "application/json"
}

# Optional request body (e.g., to pass parameters to the pipeline)
data = {
  "resources": {
    "repositories": {
      "self": {
        "refName": "refs/heads/main"
      }
    }
  }
}

try:
    #response = requests.post(url, headers=headers, data=json.dumps(data), auth=HTTPBasicAuth("", pat))
    response = requests.post(url, headers=headers, data=json.dumps(data))
    response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)
   
    # Process the response
    if response.status_code == 200:
        pipeline_run_details = response.json()
        print("Pipeline triggered successfully!")
        print(f"Pipeline run URL: {pipeline_run_details['_links']['web']['href']}")
    else:
      print(f"Failed to trigger pipeline. Status code: {response.status_code}")
      print(f"Response body: {response.text}")

except requests.exceptions.RequestException as e:
    print(f"An error occurred: {e}")
1 REPLY

mark_ott
Databricks Employee

The error you’re seeing (SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1147)')) while triggering the Azure DevOps pipeline from Databricks indicates a problem with the SSL/TLS handshake, not with the firewall or a certificate. It is often caused by the way the Authorization header is constructed and encoded, and sometimes by an incorrect endpoint or network issues between Databricks (hosted in Azure) and the Azure DevOps API.

Common Causes & Resolutions

1. Authorization Header Format

Your code’s Authorization header uses hex encoding, which is not valid for HTTP Basic authorization; the credentials must be Base64-encoded.

Replace this:

"Authorization": f"Basic {(':' + pat).encode('utf-8').hex()}",

With this (Base64 encoding):

import base64

token = base64.b64encode(f":{pat}".encode()).decode()
headers = {
    "Authorization": f"Basic {token}",
    "Content-Type": "application/json"
}

The HTTP Basic Auth header must be the Base64 encoding of username:password; here the username is empty and the password is your PAT.

2. Using the Auth Argument

Alternatively, you can use the built-in HTTPBasicAuth class from the requests library, which handles header construction for you. Your commented-out code is correct:

response = requests.post(url, headers=headers, data=json.dumps(data), auth=HTTPBasicAuth("", pat))

If you use auth=HTTPBasicAuth("", pat), you don’t need to manually construct the Authorization header — let requests handle it.
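To convince yourself the two approaches are equivalent, you can compare the header requests builds against the manual Base64 version without making any network call (the PAT below is a placeholder, not a real token):

```python
import base64
import requests
from requests.auth import HTTPBasicAuth

pat = "fake_pat_for_demo"  # placeholder, not a real token

# Header requests would send when auth=HTTPBasicAuth("", pat) is used.
# Request.prepare() applies the auth without sending anything.
prepared = requests.Request(
    "POST", "https://dev.azure.com/org/proj/_apis/pipelines/12/runs",
    auth=HTTPBasicAuth("", pat),
).prepare()

# Manually built Base64 header (empty username, PAT as password)
manual = "Basic " + base64.b64encode(f":{pat}".encode()).decode()

print(prepared.headers["Authorization"] == manual)  # True
```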

3. Pipeline ID Format

The pipeline_id must be the numeric pipeline resource ID (for example, pipeline_id = 12), not a build/run number or a string like 20250224.1. The value in your code is a run number, which appears in the pipeline URL after a run starts.
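One way to find the numeric ID is to call the Pipelines list endpoint (GET https://dev.azure.com/{organization}/{project}/_apis/pipelines?api-version=7.1-preview.1) and look the pipeline up by name. A minimal sketch of the lookup, assuming the documented response shape (a "value" list of objects with "id" and "name"):

```python
def find_pipeline_id(pipelines_response, name):
    """Return the numeric id of the pipeline with the given name, or None."""
    for pipeline in pipelines_response.get("value", []):
        if pipeline.get("name") == name:
            return pipeline["id"]
    return None

# Example response shape from GET .../_apis/pipelines (names are illustrative)
sample = {"count": 2, "value": [{"id": 12, "name": "Build"}, {"id": 15, "name": "Deploy"}]}
print(find_pipeline_id(sample, "Deploy"))  # 15
```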

4. Endpoint URL Structure

Your URL structure should be:

https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipeline_id}/runs?api-version=7.1-preview.1

Make sure organization and project do not contain special characters or spaces, and that pipeline_id is the actual pipeline resource ID.
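If the organization or project name does contain spaces or other special characters, percent-encode the path segments before building the URL. A small sketch using the standard library (the helper name is illustrative):

```python
from urllib.parse import quote

def pipeline_runs_url(organization, project, pipeline_id, api_version="7.1-preview.1"):
    """Build the pipeline-runs URL, percent-encoding the path segments."""
    org = quote(organization, safe="")
    proj = quote(project, safe="")
    return (f"https://dev.azure.com/{org}/{proj}"
            f"/_apis/pipelines/{pipeline_id}/runs?api-version={api_version}")

print(pipeline_runs_url("azdevops", "My Project", 12))
# https://dev.azure.com/azdevops/My%20Project/_apis/pipelines/12/runs?api-version=7.1-preview.1
```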

5. TLS/SSL on Databricks

If you’re running this from Databricks notebooks, make sure the cluster supports TLS 1.2 and has up-to-date root certificates. Sometimes, upgrading the Databricks runtime resolves SSL issues.
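From a notebook cell you can check which OpenSSL build the Python runtime uses and confirm that a TLS 1.2+ client context can be created; a quick sketch using only the standard library:

```python
import ssl

# Report the OpenSSL version the Python runtime was built against
print(ssl.OPENSSL_VERSION)

# Build a client context that refuses anything older than TLS 1.2
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2
print(context.minimum_version)  # TLSVersion.TLSv1_2
```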

Sample Corrected Code

import requests
import json
from requests.auth import HTTPBasicAuth

organization = "azdevops"
project = "DevOps"
pipeline_id = 12  # Use the pipeline resource ID, NOT the build/run number!
pat = "your_PAT_here"

url = f"https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipeline_id}/runs?api-version=7.1-preview.1"
headers = {"Content-Type": "application/json"}

data = {
    "resources": {
        "repositories": {
            "self": {"refName": "refs/heads/main"}
        }
    }
}

try:
    response = requests.post(url, headers=headers, data=json.dumps(data), auth=HTTPBasicAuth("", pat))
    response.raise_for_status()

    if response.status_code == 200:
        pipeline_run_details = response.json()
        print("Pipeline triggered successfully!")
        print(f"Pipeline run URL: {pipeline_run_details['_links']['web']['href']}")
    else:
        print(f"Failed to trigger pipeline. Status code: {response.status_code}")
        print(f"Response body: {response.text}")
except requests.exceptions.RequestException as e:
    print(f"An error occurred: {e}")

Key Checklist

  • PAT is correct and has required permissions for pipeline execution.

  • Pipeline ID is the numeric resource ID, not a run/build number.

  • Authorization header uses base64 or HTTPBasicAuth, not hex.

  • URL matches Azure DevOps pipeline trigger format.

  • Databricks cluster runtime is updated and supports TLS 1.2.

If you still see the SSL error after these corrections, check the network policies in Azure (NSG, private endpoints), and consider running the code outside Databricks to rule out platform-specific issues.