Hi @cuhlmann ,
As I understand it, you need to ingest data into Azure Databricks from external systems, and your customer requires client certificate authentication. The challenge is that the client certificate is stored in Azure Key Vault, but the libraries you're using (Requests, Aiohttp, the Solace broker API) expect a local certificate file for client authentication.
Here is a possible solution:
- Create a key vault-backed secret scope in Databricks
- Access secrets in the notebook like this:
```python
# Accessing secrets from the Key Vault-backed scope
client_cert = dbutils.secrets.get(scope="your_scope", key="client_certificate")
client_key = dbutils.secrets.get(scope="your_scope", key="client_key")
```
- Write the secrets out to a certificate file and save it to an accessible location (a Unity Catalog Volume when using Unity Catalog, a DBFS /tmp path without Unity Catalog)
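The step above could be sketched like this — a small helper that concatenates certificate and key into one PEM bundle, the format Requests accepts when `cert` is a single path. The scope/key names and the Volume path in the usage comment are assumptions, not part of your setup:

```python
def write_cert_bundle(client_cert: str, client_key: str, cert_file_path: str) -> str:
    """Concatenate the certificate and private key into a single PEM file
    (the format `requests` expects when `cert` is given as one path)."""
    with open(cert_file_path, "w") as f:
        f.write(client_cert.strip() + "\n")
        f.write(client_key.strip() + "\n")
    return cert_file_path

# In the notebook (hypothetical names — adjust scope, keys, and path):
# cert_file_path = write_cert_bundle(
#     dbutils.secrets.get(scope="your_scope", key="client_certificate"),
#     dbutils.secrets.get(scope="your_scope", key="client_key"),
#     "/Volumes/my_catalog/my_schema/my_volume/client_cert.pem",
# )
```

Keep in mind the private key then sits on disk in plain text, so a Unity Catalog Volume with restricted permissions (or a cleanup step after use) is preferable to a world-readable DBFS path.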
- Use the certificate file in your code
```python
import requests

url = "https://external-system.example.com/api/data"
# Path of the PEM file written in the previous step (example path — adjust to yours)
cert_file_path = "/Volumes/my_catalog/my_schema/my_volume/client_cert.pem"

# Use the certificate file for client authentication
response = requests.get(url, cert=cert_file_path)

# Check the response
print(response.status_code)
print(response.text)
```
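For the Aiohttp case you mentioned, the same PEM file can be loaded into an `ssl.SSLContext` (using only the standard library), which Aiohttp accepts via its `ssl=` parameter on requests or the connector. A minimal sketch, assuming the bundle file from the previous steps:

```python
import ssl

def build_client_ssl_context(cert_file_path: str) -> ssl.SSLContext:
    """Build an SSL context that presents the client cert + key bundle.
    The returned context can be passed to aiohttp, e.g.
    `session.get(url, ssl=ctx)`."""
    ctx = ssl.create_default_context()
    # load_cert_chain accepts a single PEM file containing both cert and key
    ctx.load_cert_chain(certfile=cert_file_path)
    return ctx
```

The same context also works for other libraries that accept a standard `SSLContext`, so you only manage the certificate material in one place.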