Data Engineering

Facing issue while using widget values in SQL script

Ashwathy
New Contributor II

I am using the code below to create and read widgets, assigning a default value.

dbutils.widgets.text("pname", "default", "parameter_name")

pname = dbutils.widgets.get("pname")

I am using this widget parameter in some SQL scripts. One example is given below:

%sql

CREATE DATABASE IF NOT EXISTS ${pname}_test;

I am executing this notebook from ADF. I am not passing any value for this parameter, so it should take the default value.

The problem I am facing is that sometimes the widget value is reflected in the SQL script and sometimes it is not, so a database named "_test" gets created. This is happening in my dev environment. I am using exactly the same code in my production environment, and there it works fine.

Sometimes everything works fine and sometimes I hit this issue. Can anyone please help me identify the cause?

2 REPLIES

Hi @Kaniz Fatma, I tried executing the notebook on runtimes 9.1 LTS and 10.4 LTS.

The problem is that I am using these SQL scripts in a Python notebook. I am able to read the widget values and print them with a Python print statement, and I get the value as expected.

When I use the same variable in a SQL script in the same notebook, the value is not reflected there; instead it comes through as empty.

I changed all my SQL scripts to spark.sql() statements and now the notebooks run fine. But I still do not understand why the SQL scripts are not working here.
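
For anyone hitting the same thing, the workaround looks roughly like this (a minimal sketch assuming the same widget name and database naming convention as above):

# Read the widget value in Python, where it resolves reliably
dbutils.widgets.text("pname", "default", "parameter_name")
pname = dbutils.widgets.get("pname")

# Build the statement in Python and run it with spark.sql(),
# so nothing depends on widget substitution inside a %sql cell
spark.sql(f"CREATE DATABASE IF NOT EXISTS {pname}_test")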

takomyr
New Contributor II

https://docs.databricks.com/notebooks/widgets.html states:

In general, you cannot use widgets to pass arguments between different languages within a notebook. You can create a widget arg1 in a Python cell and use it in a SQL or Scala cell if you run one cell at a time. However, this does not work if you use Run All or run the notebook as a job.
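
That matches what was happening here. As an illustration of the pattern the docs describe (the arg1 name is just the example from the docs; cell boundaries are indicated in comments):

# Python cell: create the widget
dbutils.widgets.text("arg1", "default")

%sql
-- SQL cell: ${arg1} is substituted when the cells are run one at a time,
-- but can come back empty under Run All or a job/ADF-triggered run
SELECT '${arg1}'

Building the statement in Python with spark.sql(), as in the earlier reply, avoids the cross-language hand-off entirely.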
