Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Using Parameters in EXECUTE IMMEDIATE on Databricks SQL 2025.15 not working

iamgoce
New Contributor III

Hi everyone,

We are having an issue using parameters with EXECUTE IMMEDIATE statements when running on a serverless SQL warehouse on DBSQL v2025.15 (currently in the preview channel):

```sql
declare or replace var query = "SELECT :PARAMETER_1";
EXECUTE IMMEDIATE query;
```

This fails with:

[UNBOUND_SQL_PARAMETER] Found the unbound parameter: PARAMETER_1. Please, fix `args` and provide a ma...

See below for screenshots of the issue. There is a workaround for v2025.15, using the ${PARAMETER_1} marker syntax, but that syntax is being deprecated.

[screenshots of the error]

 

1 ACCEPTED SOLUTION


BigRoux
Databricks Employee

Did a little more digging and have further information that might be helpful:

The behavior you're observing with `${PARAMETER_1}` and `:PARAMETER_1` in Databricks SQL is tied to differences in how Databricks Runtime versions handle parameterization and syntax. Here's a detailed explanation:

Why `${PARAMETER_1}` Works in v2025.15
1. String Interpolation Behavior: The `${PARAMETER_1}` syntax is interpreted as a string interpolation mechanism, where the value of the parameter (defined via a widget or variable in the notebook) is substituted directly into the query string before execution. This approach doesn't rely on formal parameter binding but instead performs a textual substitution at runtime.
2. Backward Compatibility: Databricks has historically supported `${}` syntax for parameter substitution, especially in notebooks. This syntax is less strict than named parameter markers (`:PARAMETER_1`) and doesn't require explicit binding via APIs or widgets, making it more user-friendly for interactive workflows.
3. Execution Context: In your example:
```sql
declare or replace var query = "SELECT '${PARAMETER_1}'";
EXECUTE IMMEDIATE query;
```
The value of `${PARAMETER_1}` is substituted into the query string before `EXECUTE IMMEDIATE` runs it, avoiding unbound parameter errors.
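To make the substitution concrete (the widget value shown is hypothetical), if the PARAMETER_1 widget holds abc, the string that `EXECUTE IMMEDIATE` actually receives is:

```sql
-- Query after ${PARAMETER_1} substitution, assuming the widget value is 'abc':
SELECT 'abc';
```

Because this is plain text substitution, the value must be quoted inside the query string, and untrusted values carry a SQL injection risk.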

 

Why `:PARAMETER_1` Worked in v2025.10
1. Named Parameter Marker Syntax: The `:PARAMETER_1` syntax represents a named parameter marker that requires explicit binding of a value before execution. In earlier versions like v2025.10, this behavior might have been supported implicitly in certain contexts (e.g., workflows or API calls) where the parameter was automatically bound to a widget or variable.
2. Potential Unintended Behavior: If `:PARAMETER_1` worked without explicit binding, it could have been due to leniency or an oversight in the runtime's implementation rather than a deliberate feature. Such behavior might not be consistent across all scenarios or runtime versions.

 

Changes Between v2025.10 and v2025.15
1. Stricter Enforcement of Parameter Binding: Starting with newer Databricks Runtime versions (e.g., v2025.15), there has been a shift toward stricter enforcement of named parameter marker syntax (`:PARAMETER_1`). This aligns with best practices for SQL injection prevention and formal parameterization.
2. Deprecation of Implicit Binding: If `:PARAMETER_1` worked without explicit binding in v2025.10, this was likely unintended and has since been addressed to require proper mapping of parameters to values.

 

Should You Avoid Using `:PARAMETER_1`?
Yes, you should avoid relying on `:PARAMETER_1` without explicit binding because:
- It may not work consistently across different runtime versions or contexts (e.g., workflows vs. interactive notebooks).
- Databricks documentation strongly recommends using named parameter markers only when explicitly bound via APIs, widgets, or other mechanisms.
- The `${}` syntax remains more reliable for interactive workflows where direct substitution is sufficient.

 

Best Practices Moving Forward
To ensure compatibility and avoid runtime errors:
- Use `${PARAMETER}` for simple string interpolation in notebooks and workflows where values are defined via widgets or variables.
- Use `:PARAMETER_NAME` only when explicitly binding parameters through APIs or structured workflows.
- Always test queries on the target Databricks Runtime version to confirm expected behavior.
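One more option worth noting: per the Databricks `EXECUTE IMMEDIATE` documentation, named parameter markers inside the statement string can also be bound explicitly with a `USING` clause, which avoids both textual substitution and the unbound-parameter error. A minimal sketch (the value 42 and the alias `col1` are illustrative):

```sql
DECLARE OR REPLACE VARIABLE query STRING DEFAULT 'SELECT :parameter_1 AS col1';

-- Bind the named marker :parameter_1 explicitly at execution time:
EXECUTE IMMEDIATE query USING 42 AS parameter_1;
```

Because the value travels as a bound parameter rather than as text, no quoting or escaping of the value is required.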

 

4 REPLIES

BigRoux
Databricks Employee
Try this
 
The SQL script snippet provided (`DECLARE OR REPLACE VAR query = "SELECT :PARAMETER_1"; EXECUTE IMMEDIATE query;`) will not work as intended in Databricks SQL because it uses the placeholder `:PARAMETER_1` without binding a value to it, and without using a parameterized query format that Databricks SQL supports.
The error corresponds to an unbound SQL parameter: the declared marker (`:PARAMETER_1`) is not bound to a concrete value at execution time. Also, Databricks SQL doesn't support bound parameter syntax the way PostgreSQL or Oracle do.
To address this issue, you should:
  1. Directly substitute the desired value into the query string rather than using unbound parameters.
  2. Avoid EXECUTE IMMEDIATE unless it is required in a procedural-style SQL flow; typically you would write queries in standard SQL directly.
Here is a working structure for a valid query in Databricks SQL, using a session variable:

```sql
DECLARE OR REPLACE VARIABLE my_param STRING;
SET VARIABLE my_param = 'desired_value';
SELECT my_param;
```

This approach ensures that the value is dynamically assigned and incorporated into the SQL query.
The error documentation from Azure Databricks confirms the need for properly binding or removing unbound parameters in SQL queries.
 
Cheers, Louis.

iamgoce
New Contributor III

Hi Louis,

Thanks for your reply and explanation.

Note that we do assign the parameter a value when executing the script inside a workflow job. In the sample notebook I also defined the parameter using the widget and assigned it a value directly.

I'm trying to understand why the following code works in v2025.15:

```sql
declare or replace var query = "SELECT '${PARAMETER_1}'";
EXECUTE IMMEDIATE query;
```


Isn't ${PARAMETER_1} also unbound, and shouldn't it throw the same error?

Also, using :PARAMETER_1 works fine in v2025.10. Was that unintended behaviour that we should stay away from?

Thanks


iamgoce
New Contributor III

@BigRoux thanks for the detailed explanation, that makes more sense.
