Alternatively, you can use Databricks widgets to parametrize your notebook and queries. Any change to a widget value can automatically re-run the cells that read that parameter.
You can also configure what happens when a widget value changes: run only the cells that accessed the widget, run the entire notebook, or do nothing (set in the widget panel's settings).
Here is a sample in Python. Note that the auto-trigger/execute behavior for specific cells only works with Python; for SQL, you need to templatize your queries through spark.sql. Make sure the cell contains the line dbutils.widgets.get("catalog_name_param") so that it is picked up for auto-triggering.
# To create a widget in the notebook, use the following command:
dbutils.widgets.text("catalog_name_param", "")
# Read parameter in notebook with templatized SQL query
query = """
select * from system.information_schema.catalog_privileges
where catalog_name like :catalog_name_param
"""
args = {"catalog_name_param": dbutils.widgets.get("catalog_name_param")}
df = spark.sql(query, args=args)
display(df)
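The key point in the snippet above is that the parameter is passed as a named marker (:catalog_name_param) plus an args dict, rather than being string-interpolated into the SQL text, which avoids quoting and injection issues. Since spark and dbutils are only available inside Databricks, here is a minimal stand-in sketch of the same parameterized-query pattern using Python's standard-library sqlite3 module (the table and values below are illustrative, not the real system.information_schema.catalog_privileges):

```python
import sqlite3

# Illustrative stand-in table for system.information_schema.catalog_privileges
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE catalog_privileges (catalog_name TEXT, privilege TEXT)")
conn.executemany(
    "INSERT INTO catalog_privileges VALUES (?, ?)",
    [("main", "SELECT"), ("main", "MODIFY"), ("sales", "SELECT")],
)

# Same shape as the spark.sql call: a templated query with a named
# parameter marker, plus a dict supplying the value separately.
query = """
SELECT * FROM catalog_privileges
WHERE catalog_name LIKE :catalog_name_param
"""
args = {"catalog_name_param": "main"}
rows = conn.execute(query, args).fetchall()
print(rows)
```

In the notebook, the only difference is that the value in args comes from dbutils.widgets.get(...) instead of a literal.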
Regards,
Hari Prasad