Inquiry About Adding Filters on Notebook Dashboard
01-10-2025 12:28 AM
Subject: Inquiry About Adding Filters on Notebook Dashboard
Hi,
I have recently created some visuals in a notebook and added them to the notebook dashboard. However, I cannot find a way to add filters to the dashboard. I have looked through the available options but have not found anything related to filters.
Please note that I am referring to the Notebook dashboard and not the SQL Dashboard.
I would appreciate any guidance on how I can add filters to the Notebook dashboard.
- Labels:
  - Delta Lake
  - Spark
  - Workflows
01-10-2025 12:34 AM
@amarnathpal Thank you for bringing this up.
Currently, it is not possible to add filters directly to a Notebook dashboard in Databricks. The available options for adding filters are specific to AI/BI dashboards, which offer more advanced features such as cross-filtering, independent user sessions, and better scalability and management.
If you need filters, consider transitioning your notebook visualizations to an AI/BI dashboard. You can do this with the "Add to Dashboard" option available for SQL cells in notebooks. This moves your SQL cell content, including queries, parameters, and visualizations, to an AI/BI dashboard, where you can then add and configure filters as needed.
Check this: https://www.databricks.com/blog/present-and-share-notebook-results-in-aibi-dashboards
01-10-2025 01:10 AM
Hi @amarnathpal ,
Please follow the steps (1 to 5) shown in the snapshot to add filters to your dashboard. Additionally, you may want to explore parameters to customize your filters for optimized value selection.
Regards,
Hari Prasad
01-10-2025 01:27 AM
Hi @hari-prasad ,
Thank you for your prompt response. The solution you provided seems suited to a SQL Dashboard, but my inquiry was specifically about the capabilities of the Notebook Dashboard.
To elaborate, I have displayed data in a notebook and created visuals accordingly. Upon adding these visuals to the dashboard using the "Add to Notebook Dashboard" option, it appears that the Notebook Dashboard lacks several features that I was hoping to access.
Could you please confirm if there are additional functionalities or workarounds available for enhancing the Notebook Dashboard, such as implementing filters directly within it?
01-10-2025 02:21 AM - edited 01-10-2025 02:21 AM
In that case, you can use an alternative approach: Databricks widgets, which let you parameterize your notebook and queries. Any change to a widget's value automatically re-triggers the cell that reads the parameter.
You can also configure what runs when a widget changes: a specific cell, or the complete notebook.
Here is a sample Python snippet. Please note that the auto-trigger/execute behavior for specific cells works only with Python; for SQL, you need to templatize your queries in spark.sql. Make sure the cell includes the line dbutils.widgets.get("catalog_name_param") so that it re-runs when the widget changes.
# Create a text widget in the notebook.
# A default of "%" matches all catalogs, since the query uses LIKE
# (an empty default would match nothing).
dbutils.widgets.text("catalog_name_param", "%")

# Read the parameter and pass it to a templatized SQL query.
# Named parameter markers (:name) require Spark 3.4+ / DBR 13+.
query = """
select * from system.information_schema.catalog_privileges
where catalog_name like :catalog_name_param
"""
args = {"catalog_name_param": dbutils.widgets.get("catalog_name_param")}
df = spark.sql(query, args)
display(df)
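As an aside, if the filter should offer a fixed set of choices rather than free text, a dropdown widget works the same way. A minimal sketch, assuming a Databricks notebook (where `dbutils`, `spark`, and `display` are available); the widget name `schema_param` and the schema values are illustrative:

```
# Hypothetical sketch: a dropdown widget restricts the filter to known values.
dbutils.widgets.dropdown(
    "schema_param",                       # widget name
    "information_schema",                 # default value
    ["information_schema", "default"],    # allowed choices (illustrative)
)

# Same templatized-query pattern as above, keyed on the dropdown value.
query = """
select * from system.information_schema.tables
where table_schema = :schema_param
"""
df = spark.sql(query, {"schema_param": dbutils.widgets.get("schema_param")})
display(df)
```

Because the cell calls dbutils.widgets.get, changing the dropdown selection re-runs it automatically, which gives a filter-like experience inside the notebook dashboard.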
Regards,
Hari Prasad

