Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.
Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.
Keynote: Data Warehouse presente...
Hey, currently there is a function to export AI/BI dashboards as PDF in the UI, but it would be good to also have this option via the API. API-first is important for big organizations to scale features. In our case, we want to send a personalized email...
Thanks for the suggestions. Row-level security will not work for us, as the datasets are internally shared without restrictions. The pre-selection of the manager is just for convenience, not for compliance. Making copies could work, but honestly I don't...
Hello everyone, I recently started using AI/BI dashboards and I'm finding it difficult to cross-filter two visuals from two different datasets. Visual 1 displays sales by region (month to date) sourced from dataset 1, and Visual 2 displays open orders by...
We are seeing the same issue: you cannot cross-filter across multiple datasets. This would be an incredibly useful feature. Our scenario: we have a trip with a set of deliveries. We want to find inefficient trips based on trip-level KPIs and the...
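Until cross-dataset cross-filtering is supported, one common workaround is to merge both sources into a single dataset keyed on the shared dimension, so that a single dashboard filter drives both visuals. A minimal sketch, assuming hypothetical tables `sales_mtd` and `open_orders` that share a `region` column (all names are illustrative):

```sql
-- Hypothetical combined dataset: stacks both sources into one
-- long-format result keyed on the shared "region" dimension, so a
-- single region filter cross-filters both visuals.
SELECT 'sales_mtd' AS metric, region, sales_amount AS value
FROM catalog1.schema1.sales_mtd
UNION ALL
SELECT 'open_orders' AS metric, region, CAST(order_count AS DOUBLE) AS value
FROM catalog2.schema2.open_orders
```

Each visual then filters on its own `metric` value while sharing the dashboard-level region filter.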
Databricks launching a Free Edition and committing $100M to data + AI education isn't just about free access; it's about changing how people learn data engineering. When engineers learn on a unified platform, not a stitched-together toolchain, they s...
Dear community, we are running nightly Business Vault loads. Last year the run stopped finishing on a classic warehouse; after testing, it ran to completion once we switched to a serverless warehouse and has stayed that way, but costs have increased a lot. I have been testing numerous spar...
I saw that in this topic a reply was selected as a solution. Sadly, that is not the case, and we are still in limbo with this issue. I tried setting up a meeting through Databricks support on Azure, but the third-party rep Microsoft provided did not...
Hi -- I'm trying to connect to BigQuery as a foreign catalog. I'm able to create and successfully test the connection, but when I create a foreign catalog it appears empty, and queries against that catalog return a "TABLE_OR_VIEW_NOT_FOUND" error.The...
Hello, I had the same problem (foreign catalog for BigQuery project with limited access) and was able to solve it with the SQL command "REFRESH FOREIGN CATALOG catalog_name". It took a while, but after it finished with "OK", all BigQuery tables could...
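For reference, the command can be run from the SQL editor by a user with the relevant privileges on the catalog; a minimal sketch, with the catalog and schema names as placeholders:

```sql
-- Re-sync the foreign catalog's cached metadata with BigQuery.
REFRESH FOREIGN CATALOG my_bq_catalog;

-- Once it returns OK, the BigQuery tables should be listed:
SHOW SCHEMAS IN my_bq_catalog;
SHOW TABLES IN my_bq_catalog.my_dataset;
```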
Greetings, I'm writing this message because I want to understand how the "automatic refresh" feature works for AI/BI dashboards that use serverless SQL endpoints. I'm asking because sometimes the published dashboard refreshes when viewing the link ...
I have the same impression. But I guess that, besides the parameters, time-related functions (or any functions?) included in the source SQL must trigger the refresh. Can this behaviour be switched off somehow, e.g. by using a cached version?
Does anyone know if there is a way to get anchor links working in Databricks notebooks so you can jump to sections in the same book without a full page refresh? i.e. something that works like the following html:<a href="#jump_to_target">Jump</a>...<p...
@hobrob_ex, yes, this is possible, but not the HTML way; instead, you will have to use the markdown rendering formats.
Add # Heading 1, # Heading 2, and so on via the (+Text) button of the notebook. Once these headings/sections that you want are con...
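As a sketch, a text cell using markdown headings might look like the following; Databricks renders each heading as an entry in the notebook's table of contents, which you can click to jump to that section without a full page refresh (the content itself is illustrative):

```
# Jump target section
Some content here...

## Sub-section
More content...
```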
Hi everyone, I hope you're all doing well! I'm currently working on an AI/BI dashboard designed for business users, where the goal is for users to simply view and interact with the data in the dashboard. I've asked this question before to the Agent a...
Ultimately, your ask is essentially what happens when you're creating a new dashboard and you select some filter fields from the right side. The challenge here, however, is that you want this to be available to the user, and not simply at creation ti...
I am creating a dashboard, and I am looking to implement the scenario below.How do I create a cascading dropdown where selecting one field filters the available values in another dropdown?In a dropdown, how do I show a description for each option for...
1. Create a dataset that returns the “parent” values.
2. Add filter A and connect it to that dataset field.
3. Create a dataset for “child” values whose query is parameterized by the parent selection, or simply filters on the parent field.
4. Configure the filt...
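The steps above can be sketched with two illustrative datasets; `:selected_region` stands for the dashboard parameter bound to filter A, and all table and column names are assumptions:

```sql
-- Dataset "parent_regions": feeds the parent dropdown (filter A).
SELECT DISTINCT region FROM sales.orders;

-- Dataset "child_cities": feeds the child dropdown; the
-- :selected_region parameter is bound to the value chosen in filter A,
-- so the child options narrow to the selected parent.
SELECT DISTINCT city
FROM sales.orders
WHERE region = :selected_region;

-- To show a description next to each option, return a label column
-- alongside the raw value and point the dropdown's display field at it.
SELECT city AS value, concat(city, ' (', country, ')') AS label
FROM sales.cities;
```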
Hey @satyambaranwalc, Community Edition has been deprecated. You want to use the NEW and IMPROVED Free Edition. You can sign up here: https://www.databricks.com/learn/free-edition
Hope this helps, Louis.
Hi all! I need help configuring an ODBC connection on a Mac. I'm receiving the error "The web server failed to start, please verify if port 8020 is being used by another connection or another application". I've downloaded and installed the Databricks ODBC con...
Hi, I have created a function that I have applied as a row filter on multiple tables. The function takes one input parameter (a column value from the table). It then uses session_user() to look up the user in our users table. If the user is foun...
Hi! Sorry for the late reply; lots of holidays! No, the users table does not have a row filter on it. The structure is like this: schema1.users obviously contains a list of users, and it has a column with a customer ID for the customer they are allowed...
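Based on the structure described, a hedged reconstruction of the setup might look like this; the function, table, and column names are assumptions, and only `schema1.users` comes from the thread:

```sql
-- Assumed mapping table (from the thread):
--   schema1.users(user_name STRING, allowed_customer_id STRING)

-- Row filter function: returns TRUE only when the current session
-- user is mapped to the row's customer id.
CREATE OR REPLACE FUNCTION schema1.customer_row_filter(customer_id STRING)
RETURNS BOOLEAN
RETURN EXISTS (
  SELECT 1
  FROM schema1.users u
  WHERE u.user_name = session_user()
    AND u.allowed_customer_id = customer_id
);

-- Apply the same function as a row filter on each data table
-- (hypothetical table schema1.sales shown here).
ALTER TABLE schema1.sales
  SET ROW FILTER schema1.customer_row_filter ON (customer_id);
```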
Hello, we have already spent surprisingly many DBUs, although we have only uploaded a few tiny tables (9 tables with approx. 10 rows each). We had the idea of switching the warehouse from a serverless starter warehouse to a classic 2X-Small in order to save DBUs....
Hello @jasmin_mbi!
Did the suggestion shared above help resolve the issue with creating a classic SQL warehouse? If yes, please consider marking the response as the accepted solution.
I created a Dataflow Gen2 to get data from Databricks. I can see the preview data very quickly (around 5 seconds). But when I run the dataflow, it takes 8 hours and then cancels with a timeout. I’m trying to get 8 tables with the same schema. Six of...