Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.
Here's your Data + AI Summit 2024 Warehousing & Analytics recap: use intelligent data warehousing to improve performance and increase your organization's productivity with analytics, dashboards, and insights.
Keynote: Data Warehouse presente...
I followed the official Databricks documentation ("https://docs.databricks.com/en/_extras/notebooks/source/mongodb.html") to integrate MongoDB Atlas with Spark by setting up the MongoDB Spark Connector and configuring the connection string in my Datab...
I’m currently working with two workspaces, one for DEV and one for PROD. I’m trying to understand how I can keep the Genie/Dashboard functionalities in sync (mirrored) between these two environments. What is the best way to organize this workflow? Ide...
Hi ATN,
Good question. Based on what I know, dataset_catalog and dataset_schema in the dashboard DAB resource currently apply to datasets that use SQL queries (i.e., the queryLines property), but they do not resolve for datasets that reference metr...
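For readers who have not used these fields, a minimal bundle sketch might look like the following. Every name, path, and ID here is a placeholder, not taken from the thread, and the exact field set should be checked against the DAB dashboard resource reference:

```yaml
# databricks.yml fragment -- hypothetical names throughout.
resources:
  dashboards:
    sales_dashboard:
      display_name: "Sales Dashboard"
      file_path: ./dashboards/sales.lvdash.json
      warehouse_id: ${var.warehouse_id}
      # Per the discussion above, these overrides are resolved for datasets
      # defined with SQL (queryLines); datasets backed by metric views keep
      # the catalog/schema baked into the dashboard definition.
      dataset_catalog: prod_catalog
      dataset_schema: analytics
```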
When I click on a trace in the Databricks MLflow UI, it defaults to opening a "summary" tab. I have a somewhat complex trace, so this summary is getting bloated and I would like to clean it up. Nowhere, as far as I can tell, is the summary tab describ...
Hi @lkt1,
I took a quick look and tested it in my workspace to better understand the issue you mentioned.
As of today, the Summary tab isn’t user‑configurable. What’s displayed there, and how inputs/outputs are collapsed or truncated, is enti...
I created an external connection to an Oracle Database 12c Standard Edition Release 12.1.0.2.0. The connection is working; I have successfully executed many queries with different filters and joins. But I couldn't find a way to make any date filter pus...
Hi @diogofreitaspe,
This is a known behavior with how the Lakehouse Federation Oracle connector serializes date and timestamp literals during predicate pushdown. When Databricks pushes the filter down to Oracle, the date value gets rendered as a bare...
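To make the serialization issue concrete, here is a small, hypothetical Python helper (not part of the connector) showing the kind of explicit format mask Oracle needs if you hand-write a passthrough filter instead of relying on pushdown:

```python
from datetime import date

def oracle_date_literal(d: date) -> str:
    # Render a Python date as an Oracle TO_DATE literal with an explicit
    # format mask, so the comparison does not depend on the session's
    # NLS_DATE_FORMAT setting.
    return f"TO_DATE('{d.isoformat()}', 'YYYY-MM-DD')"

filter_sql = f"order_date >= {oracle_date_literal(date(2024, 1, 1))}"
print(filter_sql)
# order_date >= TO_DATE('2024-01-01', 'YYYY-MM-DD')
```

A bare string like '2024-01-01', by contrast, only parses if the Oracle session happens to use a matching date format, which is exactly the fragility described above.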
Hi everyone, I’m exploring the new Databricks Metric Views (Semantic Layer) and have two questions regarding programmatic management and UI visualization. 1. Parser disparity: spark.sql vs. SQL Warehouse. I'm noticing that CREATE OR REPLACE VIEW ... WITH...
Hi @smpa011,
METRIC VIEW DDL ON ALL-PURPOSE COMPUTE (spark.sql / %sql)
The CREATE VIEW ... WITH METRICS DDL requires Databricks Runtime 17.2 or above. This applies to both SQL warehouses and all-purpose clusters. If your notebook is attached to a clu...
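If you want a notebook to fail fast on an older cluster before attempting the DDL, a small sketch like this can gate it (assumption: the runtime string starts with "major.minor", as spark.version reports on Databricks clusters):

```python
def meets_min_runtime(version: str, minimum: tuple = (17, 2)) -> bool:
    # Compare the numeric prefix of a runtime string such as
    # "17.2.x-scala2.12" against a required (major, minor) pair.
    numeric = version.split("-")[0]
    parts = []
    for p in numeric.split("."):
        if p.isdigit():
            parts.append(int(p))
        else:
            break
    return tuple(parts[:2]) >= minimum

# In a notebook you might pass spark.version here and raise a clear
# error if the cluster is below DBR 17.2.
```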
With the new metric views, I am unable to understand the grouping logic in the following setup: I have a table with timestamps and I define dimensions as follows:
dimension:
  - name: timestamp
    expr: timestamp
  - name: date
    expr: DATE(timestam...
Hi @Malthe,
This is a nuanced aspect of how metric views resolve window measure dimensions. The key behavior you are seeing comes down to how the metric view engine matches your query's GROUP BY columns to the dimensions defined in the window clause....
Hi, We are in the process of designing and building new silver and gold layers (Star Schema). We will be using Databricks, which is new to the organisation. The silver layer will be modelled using classic 3NF, with SCD 7. The gold layer is a star schema....
Hi @RobTScot,
This is a common design decision in lakehouse data modeling, and the right answer depends on the layer, the tooling, and the downstream consumers. Here is a breakdown of the key considerations.
SILVER LAYER (3NF WITH SCD 7)
For silver t...
Hi @Hubert-Dudek I am getting the following error: METRIC_VIEW_WINDOW_FUNCTION_NOT_SUPPORTED The metric view is not allowed to use window function (...) with the following definition:
- name: Sales net Total
  expr: SUM(MEASURE(`Sales net`)) OVER ()
Howeve...
Hi @wrosa,
The error you are seeing (METRIC_VIEW_WINDOW_FUNCTION_NOT_SUPPORTED) is expected behavior. Metric views do not allow raw SQL window functions like SUM(...) OVER() directly in a measure's expr definition. This is by design because metric vi...
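As I understand it, the usual workaround is to keep the measure definition plain and apply the window function in the query that selects from the metric view. A pure-Python sketch of the same idea, computing a share-of-total over already-aggregated results (the per-brand figures are made up):

```python
# Hypothetical per-group results, standing in for MEASURE(`Sales net`)
# values returned by a GROUP BY query against the metric view.
per_brand_net = {"A": 120.0, "B": 80.0}

# The grand total is computed over the aggregated results, which is what
# SUM(...) OVER () in the *outer* query would do -- not inside the measure.
grand_total = sum(per_brand_net.values())
share = {k: v / grand_total for k, v in per_brand_net.items()}
print(share)  # {'A': 0.6, 'B': 0.4}
```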
Hi everyone! Just wanted to jump on here to see if anyone is having issues with Connecticut not mapping on their choropleth map? We had several we were using to map # of healthcare providers in the country but just noticed that Connecticut was no long...
Hi,
This is a known issue that stems from Connecticut's 2022 county reorganization and how the AI/BI Dashboard choropleth map resolves FIPS codes.
WHAT'S HAPPENING
In June 2022, the US Census Bureau officially replaced Connecticut's 8 traditional cou...
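For reference, these are the planning-region FIPS codes that current Census boundary files use for Connecticut. The helper is illustrative only; note the old counties do not map one-to-one onto the new regions, so a simple code rename is not enough and a proper crosswalk is needed:

```python
# Connecticut's nine planning regions and their FIPS codes, effective
# with the 2022 Census Bureau change.
CT_PLANNING_REGION_FIPS = {
    "09110": "Capitol",
    "09120": "Greater Bridgeport",
    "09130": "Lower Connecticut River Valley",
    "09140": "Naugatuck Valley",
    "09150": "Northeastern Connecticut",
    "09160": "Northwest Hills",
    "09170": "South Central Connecticut",
    "09180": "Southeastern Connecticut",
    "09190": "Western Connecticut",
}

def is_legacy_ct_county(fips: str) -> bool:
    # Legacy CT county FIPS codes were 09001..09015; these no longer
    # resolve against current boundary files, which is why the map
    # silently drops Connecticut.
    return (
        fips.startswith("09")
        and fips not in CT_PLANNING_REGION_FIPS
        and int(fips[2:]) < 100
    )
```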
Error: "dataframe.display() doesn't support data aggregation. Use display(dataframe) for better results in Databricks notebooks." But I don't use dataframe.display! I use display(dataframe). This error occurs when creating a visualization in a databrick...
Hi @Kaz1,
I understand the frustration -- the error message is misleading because you ARE already using display(dataframe), which is the correct syntax. Let me explain what is actually happening and how to work around it.
WHAT IS HAPPENING
When you c...
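Whatever the visualization editor's exact limitation turns out to be, a common workaround is to aggregate in the query itself so the chart only renders pre-aggregated rows. A stdlib-Python sketch of the idea; in a notebook you would do the equivalent with a PySpark groupBy/agg or a SQL GROUP BY before calling display():

```python
from collections import defaultdict

# Hypothetical raw rows (category, value); not from the thread.
rows = [("A", 10), ("B", 5), ("A", 7)]

# Aggregate before handing data to the renderer, instead of asking the
# visualization layer to aggregate for you.
totals = defaultdict(int)
for key, value in rows:
    totals[key] += value
print(dict(totals))  # {'A': 17, 'B': 5}
```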
Hi guys, I am making a dashboard where I want to show the AOV per brand with a date filter to change the days used in the calculation. AOV = sales / orders. I have sales and orders per day and would like the AOV to depend on the days selected. The...
Hi @a_d - Thanks for confirming.
Here is a good example.
I've also tested it and given some snapshots below if it helps. My data looks something like the below.
Above the results (on the right side), you can see an option to create a custom calcula...
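One pitfall worth flagging with this kind of custom calculation: AOV over a selected date range should be the ratio of sums, not the average of the daily ratios. The two disagree whenever daily order counts differ. A small illustration with made-up numbers:

```python
# Two hypothetical days with different order volumes.
days = [
    {"sales": 100.0, "orders": 10},  # daily AOV 10.0
    {"sales": 90.0, "orders": 3},    # daily AOV 30.0
]

# Correct: SUM(sales) / SUM(orders) over the selected days.
aov = sum(d["sales"] for d in days) / sum(d["orders"] for d in days)

# Misleading: averaging the per-day AOVs weights every day equally,
# regardless of how many orders it had.
avg_of_daily = sum(d["sales"] / d["orders"] for d in days) / len(days)

print(aov)           # ~14.62 (190 / 13)
print(avg_of_daily)  # 20.0
```

This is why the custom calculation should be built as SUM(sales) / SUM(orders), so the filter re-scopes both sums together.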
I'm developing a FastAPI middleware app (Databricks App) that connects to both a SQL Warehouse (Unity Catalog) and a Lakebase PostgreSQL instance using async SQLAlchemy. The app works perfectly when deployed to Databricks, but I'm trying to set up lo...
Summary: You should be able to test your FastAPI endpoints locally. Lakebase supports direct external connections via the standard PostgreSQL wire protocol, meaning your local SQLAlchemy setup can directly query the Lakebase instance without needing t...
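A small, hypothetical sketch of the connection-string side of this; the host, database, user, and port are placeholders to check against your actual Lakebase instance. The main gotcha it illustrates is that an OAuth token used as the password must be percent-encoded, since tokens often contain characters that break URL parsing:

```python
from urllib.parse import quote

def lakebase_url(host: str, db: str, user: str, token: str) -> str:
    # Build an async SQLAlchemy URL for a Postgres-compatible endpoint.
    # The OAuth token goes in the password slot, percent-encoded.
    # TLS is typically required; configure it via the driver's
    # connect_args (e.g. "ssl" for asyncpg) when creating the engine.
    return (
        f"postgresql+asyncpg://{quote(user, safe='')}:"
        f"{quote(token, safe='')}@{host}:5432/{db}"
    )
```

Usage would be something like `create_async_engine(lakebase_url(...))` with the sqlalchemy[asyncio] and asyncpg packages installed locally.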
My team and I are building a dashboard to present to our client (embedded within an iframe). Very recently, the color palette of the visualizations as well as the general theme section seem broken. To be more specific, if I try to update the color o...
Thanks for all the input, team. I have escalated this through our Databricks support desk and they were able to resolve it quickly. The issue was resolved on Feb 14th 2026.
Hi, I'm trying to embed a Databricks dashboard, but the language for labels like "Global Filters" always appears as English. Is there a way to make the dashboard display in Japanese when users view it as an iframe? I've already set the dashboard local...
Hi, to make a feature request you'd need to speak to a member of your account team. If you're not sure who this is, talk to your Databricks administrator in your organisation. Thanks, Emma
So, I can't figure out how to do a moving average as a custom calculation in a Databricks dashboard. I'm applying many different filters, and the denominator of the metric has to change dynamically based on the chosen filters. So, in this case, using `Custom...
You’re running into a current limitation in Databricks AI/BI dashboards. Right now, a moving average cannot be applied on top of a Custom Calculation when that metric itself is dynamically recalculated based on filters. Since your denominator changes...
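Until that limitation changes, the usual approach (as I understand it) is to push the moving average into the dataset SQL with a window function such as AVG(...) OVER (ORDER BY day ROWS BETWEEN n-1 PRECEDING AND CURRENT ROW), so filters are applied before the window is evaluated. The same trailing-average logic in plain Python, for reference:

```python
from collections import deque

def moving_average(values, window):
    # Trailing moving average over a fixed-size sliding window; early
    # positions average over however many values are available so far.
    buf, out = deque(maxlen=window), []
    for v in values:
        buf.append(v)
        out.append(sum(buf) / len(buf))
    return out

print(moving_average([1, 2, 3, 4], 2))  # [1.0, 1.5, 2.5, 3.5]
```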