Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Calling a function with parameters via Spark ODBC driver

CHorton
Visitor

Hi All,

I am having an issue with calling a Databricks SQL user defined function with parameters from my client application using the Spark ODBC driver.

I have been able to execute a straight SQL statement using parameters e.g. 

SELECT * FROM Customer WHERE CustomerId = ?

But if I try to execute a statement that contains a parameterised function it will not work e.g.

SELECT * FROM GetCustomerData(?)

I always get an [UNBOUND_SQL_PARAMETER] error.  I suspect that Databricks needs named parameter binding, but ODBC doesn't support that, so I am a bit stuck.

Has anyone come across this issue before, or have a solution?

Thanks,

Chris.

2 REPLIES

mukul1409
Visitor

Hi @CHorton,
This is a known limitation when calling Databricks SQL user-defined functions with parameters through the Spark ODBC driver. Positional (?) parameters are supported in ordinary SQL statements, but not inside function calls, which is why you get the [UNBOUND_SQL_PARAMETER] error: Databricks SQL expects parameters to be resolved before function evaluation and does not support parameter binding for UDF arguments through ODBC.

Two common workarounds:

1. Avoid the parameterized function call and inline the function's logic in the query, or use a view or table-valued function that does not require runtime parameters.
2. Construct the SQL string dynamically in the client application with the parameter value substituted, taking care to handle escaping safely.

Mukul Chauhan

iyashk-DB
Databricks Employee

Hi @CHorton 
The Databricks SQL engine does not support positional (?) parameters inside SQL UDF calls. 

When Spark SQL parses GetCustomerData(?), the parameter is unresolved at analysis time, so you get [UNBOUND_SQL_PARAMETER]. This is not an ODBC bug; it's a Spark/Databricks SQL limitation. Parameters can be bound only in ordinary expression positions, such as WHERE predicates and JOIN conditions.

You can use a stored procedure instead, which supports parameters and works correctly over ODBC.
1) Create a stored procedure:

CREATE OR REPLACE PROCEDURE GetCustomerDataSP (cust_id INT)
RETURNS TABLE
LANGUAGE SQL
AS
BEGIN
  RETURN
    SELECT *
    FROM Customer
    WHERE CustomerId = cust_id;
END;

2) Call it from SQL:

CALL GetCustomerDataSP(123);

Or call it via ODBC (parameterized):

CALL GetCustomerDataSP(?);