Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Unity Catalog Python UDF to Send Messages to MS Teams

SQLBob
New Contributor II

Good Morning All - 

This didn't seem like such a daunting task until I tried it. Of course, it's my very first function in Unity Catalog. 

Attached are images of both the UDF and example usage I created to send messages via the Python requests library within the UDF. I'll see if I can paste in the code below. 

The UDF is supplied with a URL and a payload (a JSON Adaptive Card for MS Teams). It should send the payload as a Teams message, and it works as intended when defined directly within a Python notebook. But when defined within UC and executed from a notebook, it sends no message and returns no errors.

I would appreciate it if one of you experts would eyeball this thing and pull me out of the weeds. I can't believe I'm the first one to try this, although hours spent researching it have only turned up references to defining the processing internally within the notebook.

====================  UDF ===========================

CREATE OR REPLACE FUNCTION mycatalog.myschema.udf_send_post_request(
  url STRING,
  payload STRING
)
RETURNS STRING
LANGUAGE PYTHON
AS
$$
import requests
import json

def udf_send_post_request(url, payload):
    try:
        # Convert payload from string to JSON
        payload_json = json.loads(payload)
        response = requests.post(url, json=payload_json)
        response.raise_for_status()  # Raise an error for bad status codes
        return response.text
    except requests.exceptions.RequestException as e:
        return f"Error: {e}"

# Call the helper so the UDF body actually returns its result
return udf_send_post_request(url, payload)
$$;

====================  USAGE ===========================

import json

# Set the catalog and schema
spark.sql("USE CATALOG mycatalog")
spark.sql("USE SCHEMA myschema")


# Example usage
url = "myURL"

payload = {
    "type": "AdaptiveCard",
    "attachments": [
        {
            "contentType": "application/vnd.microsoft.card.adaptive",
            "content": {
                "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
                "type": "AdaptiveCard",
                "version": "1.2",
                "body": [
                    {
                        "type": "TextBlock",
                        "text": "**TEST EXTERNAL FUNCTION**",
                    }
                ]
            }
        }
    ]
}

payload_str = json.dumps(payload)

# Create a DataFrame with the URL and payload
df = spark.createDataFrame([(url, payload_str)], ["url", "payload"])


# Use the function in a SQL query
df.createOrReplaceTempView("temp_table")
result = spark.sql(f"""SELECT udf_send_post_request(url, payload) AS response FROM temp_table""")
result.show()

 

Thank You for Any & All Input,

Bob

2 REPLIES

SQLBob
New Contributor II

This has been dropped in favor of a function defined internally within a notebook. If anyone has occasion to set up a similar process, please let me know.

Thanks

mark_ott
Databricks Employee

You're encountering a common limitation when trying to use an external HTTP request (like the Python requests library) inside a Unity Catalog UDF in Databricks. While your code is correct for a regular notebook environment, Unity Catalog UDFs (and, similarly, Spark SQL UDFs running in distributed compute) run in a restricted, sandboxed environment where outbound HTTP calls are not allowed for security reasons.

Why It Works in a Notebook But Not in a UDF

  • Notebooks: Full access to the Python environment, installed packages, and outbound internet.

  • Unity Catalog Functions: Sandboxed and limited, with no network access, no additional package installs, and only a subset of Python's standard library available.

This is by design: otherwise, user-defined SQL functions could be used to exfiltrate data or introduce security vulnerabilities.


What Are Your Options?

Use Notebook Code for HTTP Calls

  • Perform the HTTP call from your notebook code (or a Python UDF defined in the notebook); see the sketch after this list.

  • If the workflow requires SQL, submit results into a table and consume from there.
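A minimal sketch of that first option, posting the same example card straight from notebook (driver) code; "myURL" is just the placeholder from the original post, not a real webhook URL:

import json
import requests

def send_teams_message(url: str, payload: dict) -> str:
    """POST an Adaptive Card payload to a Teams webhook and return the response text."""
    try:
        response = requests.post(url, json=payload, timeout=10)
        response.raise_for_status()  # surface bad HTTP status codes
        return response.text
    except requests.exceptions.RequestException as e:
        return f"Error: {e}"

url = "myURL"  # placeholder for the real Teams webhook / workflow URL
payload = {
    "type": "AdaptiveCard",
    "attachments": [
        {
            "contentType": "application/vnd.microsoft.card.adaptive",
            "content": {
                "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
                "type": "AdaptiveCard",
                "version": "1.2",
                "body": [{"type": "TextBlock", "text": "**TEST EXTERNAL FUNCTION**"}],
            },
        }
    ],
}

print(send_teams_message(url, payload))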

Use External Functions (Databricks Feature)

Databricks External Functions allow you to call external HTTP endpoints securely from SQL. These are designed for invoking webhooks or external APIs, which is close to what you're after, and are most often integrated with Unity Catalog.

Basic example:

CREATE EXTERNAL FUNCTION send_teams_message(url STRING, payload STRING)
RETURNS STRING
USING REQUEST_URL '<your-azure-function-or-api-endpoint>';

You need a secure HTTP endpoint (an Azure Function, an AWS Lambda, or a REST API you manage) to receive the requests from Databricks.

Why "No Errors"?

  • Unity Catalog UDFs may fail silently for forbidden or unimplemented features.

  • If outbound HTTP is restricted, requests.post() simply does not run, and you may get None or an empty response with no error bubbling to the surface; a quick probe like the one sketched below can help confirm what the sandbox allows.
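One way to check this from a notebook is a throwaway diagnostic function. This is only a sketch: the name udf_probe_requests is made up, the catalog and schema are the placeholders from the original post, and a successful import still does not prove outbound network access is allowed.

# Hypothetical probe: does "import requests" succeed inside a UC Python UDF?
spark.sql("""
CREATE OR REPLACE FUNCTION mycatalog.myschema.udf_probe_requests()
RETURNS STRING
LANGUAGE PYTHON
AS $$
try:
    import requests
    return "requests importable (network access may still be blocked)"
except Exception as e:
    return f"import failed: {e}"
$$
""")

spark.sql("SELECT mycatalog.myschema.udf_probe_requests() AS probe").show(truncate=False)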


Summary Table

Environment                 | Can Do HTTP Requests? | Example Approach
----------------------------|-----------------------|------------------------------------
Notebook Python             | Yes                   | Use requests or similar
Python UDF (UC / Spark SQL) | No                    | Not supported
Unity Catalog UDF           | No                    | Not supported
External Function           | Yes                   | Use a Databricks External Function
 
 

What To Do Next

  • Refactor: Move the message-send logic out of the UC UDF (see the table-driven sketch after this list).

  • Use Databricks External Functions if you need SQL access to HTTP APIs.

  • If working in notebooks, use Python code directly where possible.
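If the URLs and payloads already sit in a table or view (like the temp_table from the original post), one hedged pattern for that refactor is to collect the rows to the driver, post them with plain requests, and write the responses back for SQL consumers. The table name teams_send_log below is made up for illustration.

import requests

def post_card(url: str, payload_json: str, timeout: int = 10) -> str:
    """POST a JSON string to a webhook URL; return the response text or an error string."""
    try:
        resp = requests.post(
            url,
            data=payload_json,
            headers={"Content-Type": "application/json"},
            timeout=timeout,
        )
        resp.raise_for_status()
        return resp.text
    except requests.exceptions.RequestException as e:
        return f"Error: {e}"

# Small row counts only: this deliberately collects to the driver
rows = spark.table("temp_table").collect()
results = [(r["url"], post_card(r["url"], r["payload"])) for r in rows]

# Persist responses so downstream SQL can consume them (hypothetical table name)
spark.createDataFrame(results, ["url", "response"]) \
    .write.mode("append").saveAsTable("mycatalog.myschema.teams_send_log")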
