05-22-2025 06:44 AM
Hello!
I'm trying to create an item in Power BI Report Server (SSRS) connected to Databricks. I can connect normally, but I'm having trouble using a parameter in a way that Databricks recognizes.
First, I'll illustrate what I do when I connect to SQL Server and use a parameter that works.
A = my report parameter, which I connect to item C;
B = my dataset, where I create a parameter that will be linked to A;
C = my entity dimension, which I connect to item A;
In my dataset query (B) I create a parameter using @Entidade, which SQL Server recognizes as a parameter, and then I link it to my report parameter (A).
Everything works well and I get the expected result:
Now, when I try to do the same thing connected to Databricks, I have problems setting the parameters in the query.
In Databricks I can use parameters such as {{ Entidade }} or :Entidade, but neither works in SSRS; I always get a syntax error. Those parameters do work in the SQL editor and in notebooks in Databricks.
What kind of parameter syntax can be used to solve this problem in SSRS?
Example query connected to SQL Server:
select id, AtualizadoEm,EntidadeId, EntidadeNome, ContratoNome, PedidoId, PedidoDataIntegracao, MarcacaoData
from aco.dbo.Orders with (nolock)
where EntidadeNome = @Entidade
Example query connected to Databricks:
select a.dt_periodo_registro, a.nm_entidade, a.nr_orders
from prd.dbo.orders_v2 as a
-- where a.nm_entidade = '${entidade}' -- nok
where a.nm_entidade = :entidade -- nok in SSRS, but works in Databricks
-- where a.nm_entidade = {{entidade}} -- nok in SSRS, but works in Databricks
-- where a.nm_entidade = {{ entidade }} -- nok in SSRS, but works in Databricks
05-23-2025 01:28 AM
Working with SSRS can be a pain, but it is possible.
Using OPENQUERY you can probably pull it off.
That's what we use anyway. We also do a dirty trick of defining Databricks as a linked server and then using stored procedures on the linked server (for those reports that will not work otherwise). I can't tell you a lot about that because I do not build the reports myself.
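Roughly, the idea looks like this. The sketch below assumes a linked server named DATABRICKS (a hypothetical name) already defined on the SQL Server side over the Databricks ODBC driver, reusing the table from the example query earlier in the thread; it illustrates the approach rather than our exact setup:
-- OPENQUERY ships the inner text to Databricks as-is; it only accepts a
-- literal string, so any filter value has to be part of that literal.
SELECT *
FROM OPENQUERY(DATABRICKS,
    'SELECT nm_entidade, nr_orders
     FROM prd.dbo.orders_v2
     WHERE nm_entidade = ''Entidade A''');

-- To pass a value coming from a report parameter, a pass-through EXEC ... AT
-- with a ? placeholder (requires RPC Out enabled on the linked server) avoids
-- building the OPENQUERY string dynamically; this is the kind of thing the
-- stored procedure the report calls would wrap.
DECLARE @entidade NVARCHAR(100) = N'Entidade A';
EXEC ('SELECT nm_entidade, nr_orders FROM prd.dbo.orders_v2 WHERE nm_entidade = ?',
      @entidade) AT DATABRICKS;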
05-23-2025 10:26 AM
We can't use a linked server/stored procedures because we will stop using SQL Server Management Studio and focus only on the lake/Databricks. But it's a path that would work for now.
Because of this, we are looking for a solution that is 100% in Databricks.
08-26-2025 03:16 AM
Did you find any solution for this?
08-26-2025 05:33 AM
Hello.
No, I didn't. I'm still looking for a solution.
I have tried using a stored procedure in Databricks, but PBI Report Builder still doesn't recognize the parameters.
3 weeks ago
@fellipeao This is the only way I found that works well with Databricks, since positional arguments (?) were a fail for me. This is for the latest version of Power BI Report Builder.
https://learn.microsoft.com/en-us/power-bi/paginated-reports/report-builder/connect-snowflake-databr...
Once connected to Databricks, you can choose any random table, then go to the Advanced Editor and make these changes:
1 - Use the NativeQuery function (Value.NativeQuery) to be able to insert your SQL query:
Follow the final steps in the documentation to bind the parameter to your report parameter under the dataset properties.
Good luck!
a week ago
Hello.
I tried this approach a few days ago, and the parameters work fine on my local machine using my Databricks token. I got the expected result after running the query.
The main issue is that I'm building this object to be published on Power BI Report Server, which only supports connections to Databricks via an ODBC driver.
After publishing the object, I’m able to switch the connection to ODBC, test it successfully, and I can also see the parameters. However, when executing the query, it returns an error.
Testing the ODBC connection (OK):
Checking my parameters (OK):
Opening the report (OK):
Running the report (not OK):
I have tried many configurations, but I still have trouble.
Thanks for the suggestion.
a week ago
It will most likely fail because it's a different connector, which I don't think works with Power BI Report Server.
Your only bet is to try positional arguments, since you are using ODBC.
SELECT * FROM table WHERE columnx = ? AND columny = ?
Check this article out: https://medium.com/@kyle.hale/passing-multi-valued-parameters-from-power-bi-paginated-reports-to-dat...
It explains the steps in detail.
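Applied to the query from the original post, that would look roughly like this (a sketch only; the ? is mapped to the report parameter by its position in the dataset's Parameters tab):
-- Positional ODBC placeholder instead of the named Databricks parameter;
-- the first ? is bound to the first parameter listed for the dataset.
select a.dt_periodo_registro, a.nm_entidade, a.nr_orders
from prd.dbo.orders_v2 as a
where a.nm_entidade = ?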
Tuesday - last edited Tuesday
Thanks for the suggestion. I wasn't able to replicate the example from the article, but it pointed me in the right direction. With that, I managed to configure what I needed and it worked 🙂
I'll detail an example here:
1. I create the connection using the ODBC driver with Databricks.
2. I create a dataset (fact) with all the columns I want to display in the result table and validate the query (just a plain SELECT).
3. Then, I create another dataset (pEntity) with the distinct values of a categorical column, which will be used as one of my parameters (@entity_name).
4. In the Parameters menu, I create the @entity_name parameter, pointing it to my dataset pEntity, plus two date parameters: @DateStart and @DateEnd.
5. Now comes the part that enables the use of parameters. I go back to my main dataset (fact) and replace the plain SELECT with the expression below, so the parameters I created are joined into the query. It's not necessary to create the parameters in the Parameters sub-tab of the main dataset.
="SELECT code, entity_name, dt_register, in_active " &
"FROM prd.dbo.entity_resume " &
"WHERE in_active_reg = 1 " &
"AND entity_name IN ('" & Join(Parameters!entity_name.Value, "','") & "') " &
"AND dt_register BETWEEN DATE('" & Format(Parameters!DateStart.Value, "yyyy-MM-dd") & "') " &
"AND DATE('" & Format(Parameters!DateEnd.Value, "yyyy-MM-dd") & "')"
configs:
result:
Tuesday
Yes, using expressions to build queries should also work. Slightly complex but good if you have simple queries. Good job!