Misleading UNBOUND_SQL_PARAMETER even though parameter specified

jhrcek
New Contributor II

Hello. Please forgive me if this is not the right place to ask, but I'm having issues with Databricks' Statement Execution API. I'm developing a Haskell client for this API.
I've managed to implement most of it, but I'm running into issues when using named parameters to interpolate values into queries.
The documentation says that I can use named parameter markers in combination with IDENTIFIER to interpolate table names into queries.
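
For reference, the pattern the documentation shows is a SELECT through IDENTIFIER, along these lines (an illustrative request, not copied from the docs; the workspace, token, and warehouse IDs are placeholders as below):

curl --request POST \
https://WORKSPACE_ID_REDACTED.cloud.databricks.com/api/2.0/sql/statements/ \
--header "Authorization: Bearer DATABRICKS_TOKEN_REDACTED" \
--header "Content-Type: application/json" \
--data '{
"warehouse_id": "WAREHOUSE_ID_REDACTED",
"statement": "SELECT * FROM IDENTIFIER(:mytable) WHERE name = :name",
"parameters": [{"name": "mytable", "value": "z_jan_hrcek.test.person"},{"name": "name", "value": "jan"}]
}'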

This does not work in all cases, however. Specifically, I get a very misleading error in the following case, when interpolating a table identifier into an INSERT statement. I'm running the following request (using the Linux `curl` command to illustrate the issue):

curl --request POST \
https://WORKSPACE_ID_REDACTED.cloud.databricks.com/api/2.0/sql/statements/ \
--header "Authorization: Bearer DATABRICKS_TOKEN_REDACTED" \
--header "Content-Type: application/json" \
--data '{
"warehouse_id": "WAREHOUSE_ID_REDACTED",
"statement": "INSERT INTO IDENTIFIER(:mytable) VALUES (:name, :age)",
"parameters": [{"name": "name", "value": "jan"},{"name": "age", "value": "10"},{"name": "mytable", "value": "z_jan_hrcek.test.person"}]
}'


Actual behavior:

I get an error response like this:

{"statement_id":"01eed9f9-13c6-1abf-ad15-22647b4a619c","status":{"state":"FAILED","error":{"error_code":"BAD_REQUEST","message":"[UNBOUND_SQL_PARAMETER] Found the unbound parameter: name. Please, fix `args` and provide a mapping of the parameter to a SQL literal.; line 1 pos 41"}}}

This is misleading, because it states I'm not providing the `name` parameter, even though you can see I am providing a value for this parameter in the request.

Expected behavior:
- all supplied parameters are interpolated into the query, OR
- a better error message that makes the actual issue clear (is it that interpolating a table name into an INSERT statement is not supported?)

Interestingly, if I hard-code the table name into the query, everything works as expected:

curl --request POST \
https://WORKSPACE_ID_REDACTED.cloud.databricks.com/api/2.0/sql/statements/ \
--header "Authorization: Bearer DATABRICKS_TOKEN_REDACTED" \
--header "Content-Type: application/json" \
--data '{
"warehouse_id": "WAREHOUSE_ID_REDACTED",
"statement": "INSERT INTO z_jan_hrcek.test.person VALUES (:name, :age)",
"parameters": [{"name": "name", "value": "jan"},{"name": "age", "value": "10"}]
}'

# ^ This succeeds, and 1 row is inserted into the table


But for my use case I need the ability to specify the table name dynamically, and ideally to use parameter interpolation to help prevent SQL injection.
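
Until that works, a possible stopgap is to validate the table name client-side and splice it into the statement myself, keeping the values as bound parameters. A rough bash sketch of that guard (same placeholder IDs as above; the check only admits three dot-separated identifier parts):

TABLE='z_jan_hrcek.test.person'
# Only allow catalog.schema.table built from [A-Za-z0-9_] parts before splicing
if [[ "$TABLE" =~ ^[A-Za-z_][A-Za-z0-9_]*(\.[A-Za-z_][A-Za-z0-9_]*){2}$ ]]; then
curl --request POST \
https://WORKSPACE_ID_REDACTED.cloud.databricks.com/api/2.0/sql/statements/ \
--header "Authorization: Bearer DATABRICKS_TOKEN_REDACTED" \
--header "Content-Type: application/json" \
--data '{
"warehouse_id": "WAREHOUSE_ID_REDACTED",
"statement": "INSERT INTO '"$TABLE"' VALUES (:name, :age)",
"parameters": [{"name": "name", "value": "jan"},{"name": "age", "value": "10"}]
}'
else
echo "refusing to interpolate unsafe table name: $TABLE" >&2
fi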

4 REPLIES

Kaniz
Community Manager

Hi @jhrcek, Let’s dive into the issue you’re facing with Databricks’ Statement Execution API and named parameters.

  1. Named Parameters and Table Identifiers:

    • Named parameter markers are a powerful feature for supplying values to SQL statements. They help prevent SQL injection attacks by clearly separating provided values from the SQL code.
    • You mentioned that you’re trying to interpolate table names using named parameters in an INSERT statement. This is a common use case, especially when dealing with dynamic table names.
    • However, it seems you’re encountering an issue where the named parameters are not being correctly bound in your query.
  2. The Error Message:

    • The error message you received is indeed misleading. Let’s break it down:
      • UNBOUND_SQL_PARAMETER: This indicates that a parameter in your SQL statement is not bound (i.e., its value is not provided).
      • Found the unbound parameter: name: It suggests that the name parameter is not being properly bound.
      • The error message points to line 1 pos 41, which is where the issue supposedly occurs.
    • However, you’ve clearly provided the value for the name parameter in your request, so something else might be going wrong.
  3. Troubleshooting Steps:

    • To troubleshoot this issue, consider the following steps:
      • Check Parameter Names: Ensure that the parameter names in your SQL statement (:name, :age, and :mytable) match the names you’re providing in the parameters section of your request.
      • Quotes Around IDENTIFIER: When using IDENTIFIER, make sure you’re enclosing the table name in double quotes (if needed). For example: INSERT INTO IDENTIFIER("z_jan_hrcek.test.person").
      • API Version Compatibility: Verify that you’re using the correct version of the Databricks API (2.0 in your case).
      • Permissions and Warehouse Configuration: Ensure that your Databricks user has the necessary permissions to execute the statement and that the specified warehouse is correctly configured.
      • Try Unnamed Parameters: As a test, try using unnamed parameters (e.g., ?) instead of named parameters. If this works, it might help narrow down the issue.
  4. Hard-Coded Table Name:

    • You mentioned that everything works as expected when you hard-code the table name into the query. This suggests that the issue lies specifically with parameter binding.
    • While hard-coding works, it's not a scalable solution for dynamic table names.
  5. Conclusion:

    • To summarize, ensure that your parameter names match, check the IDENTIFIER usage, and explore unnamed parameters. 
    • Remember that you’re not alone—many developers face similar challenges when dealing with dynamic SQL queries and parameter binding.

Feel free to share more details or ask for additional help if needed! 🚀

jhrcek
New Contributor II

Thanks for the reply, but I don't find this (LLM-generated?) reply very helpful 😢

None of the troubleshooting tips helps:

- Check Parameter Names: don't you see the example code? You can check for yourself that the parameter names in the query correspond to the names of the args in the parameters list.
- Quotes Around IDENTIFIER: the IDENTIFIER has to contain the table name placeholder. It can't be quoted, otherwise parameter interpolation doesn't work.
- API Version Compatibility: the API version IS in the URL (/2.0/), as you can see in my examples.
- Permissions and Warehouse Configuration: this can't be the problem, because everything works when I hard-code the table name.
- Try Unnamed Parameters: unnamed parameters are not supported in the Statement Execution API, according to the documentation (https://docs.databricks.com/api/workspace/statementexecution/executestatement#parameters): "Currently, positional parameters denoted by a ? marker are not supported by the Databricks SQL Statement Execution API."

What would be more helpful is if you could point me to a GitHub repo/issue tracker where I can report this issue, as it seems to be a problem with how the API backend is implemented.

Kaniz
Community Manager

Hi @jhrcek, I understand your frustration, and I apologize if my previous reply didn't address your concerns adequately. It seems you've already explored several troubleshooting tips without success.

To address the issue you're facing with the Databricks SQL Statement Execution API, I'll look into finding a suitable GitHub repository or issue tracker where you can report this problem. I'll get back to you as soon as I have more information.

Thank you for bringing this to our attention, and I appreciate your patience.

jhrcek
New Contributor II

Ok, I think I found what's behind this.
This seems to be a bug in Spark itself.
Here's how I was able to reproduce it in a spark-shell running in a Docker container:

https://gist.github.com/jhrcek/49386004a9a47172649158af288106f4
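
In short, the gist boils down to something like this (a sketch rather than a verbatim copy of the gist, assuming Spark 3.5, where both named parameters and the IDENTIFIER clause are available):

docker run -i --rm apache/spark:3.5.0 /opt/spark/bin/spark-shell <<'EOF'
// table to insert into, in the default catalog of the local session
spark.sql("CREATE TABLE person (name STRING, age INT)")
// SELECT through IDENTIFIER with a named parameter binds fine
spark.sql("SELECT * FROM IDENTIFIER(:tab)", Map("tab" -> "person")).show()
// INSERT through IDENTIFIER fails with [UNBOUND_SQL_PARAMETER] Found the unbound parameter: name
spark.sql("INSERT INTO IDENTIFIER(:tab) VALUES (:name, :age)", Map("tab" -> "person", "name" -> "jan", "age" -> 10))
EOF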

I think it's best if I just report it in the Spark issue tracker.
