how to pull a parameter from .sql file with dbutils.notebook.run
02-06-2025 06:56 AM
Hi,
I want to use this:
I already tested this with another user, so it is not that I am unsure about the setup itself.
Thanks in advance
02-06-2025 07:08 AM
Hi @carlos_tasayco,
When you use dbutils.notebook.run, it expects the target to be a notebook that can accept parameters and execute code within the Databricks environment. This function does not directly support running .sql files. Instead, place your SQL code inside a Databricks notebook (e.g., a .py file) and then use dbutils.notebook.run to execute that notebook.
Create a new Databricks notebook (e.g., run_sql_notebook.py) and place your SQL code inside it. For example:
# run_sql_notebook.py
# Read the "environment" parameter passed in by the caller.
inputEnvironment = dbutils.widgets.get("environment")

# spark.sql() generally executes one statement per call,
# so run each statement separately rather than one multi-statement string.
spark.sql("DROP TEMPORARY VARIABLE IF EXISTS strEnv")
spark.sql("DECLARE VARIABLE strEnv STRING")
spark.sql(f"SET VARIABLE strEnv = '{inputEnvironment}'")
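If you do want to keep the SQL as one multi-statement string, a small helper that splits it into individual statements can be handy before feeding each one to spark.sql. This is a simplistic sketch: it splits on semicolons and does not handle semicolons inside string literals or comments.

```python
def split_sql_statements(script: str) -> list[str]:
    """Split a SQL script into individual statements.

    Naive split on ';' -- does not handle semicolons inside
    string literals or SQL comments.
    """
    return [s.strip() for s in script.split(";") if s.strip()]

statements = split_sql_statements("""
DROP TEMPORARY VARIABLE IF EXISTS strEnv;
DECLARE VARIABLE strEnv STRING;
SET VARIABLE strEnv = 'dev';
""")
# Each entry can then be passed to spark.sql(stmt) one at a time.
```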
02-06-2025 07:11 AM
Thanks.
OK, setting dbutils.notebook.run aside, is there another way to get a parameter from a .sql file into my notebook?
02-06-2025 07:33 AM
@carlos_tasayco There are two ways you can pass a value from one notebook to another as input:
- using widgets, or
- using dbutils.notebook.exit together with collect, like below.

# In notebook1
result = spark.sql("SELECT value FROM table").collect()[0][0]
# Hand the value back to the caller (it is returned as a string).
dbutils.notebook.exit(result)

# In notebook2
value = dbutils.notebook.run("notebook1", 60)  # timeout in seconds

Let me know for anything else.
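If the real goal is to pull a parameter straight out of a .sql file, another option is to read the file in the notebook and substitute values with Python's string.Template before handing the text to spark.sql. This is a sketch under two assumptions: the .sql file uses ${name} placeholders, and the workspace path shown in the usage comment is hypothetical.

```python
from string import Template

def load_sql(path: str, **params: str) -> str:
    """Read a .sql file and substitute ${name} placeholders with values."""
    with open(path) as f:
        return Template(f.read()).substitute(**params)

# Example usage (path is hypothetical):
#   query = load_sql("/Workspace/queries/my_query.sql", environment="dev")
#   spark.sql(query)
```

Template.substitute raises KeyError if a placeholder in the file has no matching parameter, which makes missing parameters fail loudly instead of silently running the wrong query.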

