Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 3106 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
bferrell
by New Contributor II
  • 2861 Views
  • 6 replies
  • 2 kudos

Connecting to SQL Warehouse in Custom App

I've got a custom Dash app I've written and am attempting to deploy. It runs fine on my local machine (while accessing my DB SQL Warehouse), but when I try deploying to Databricks, it cannot connect to the data for some reason. I was basically follow...

Latest Reply
Isi
Honored Contributor III
  • 2 kudos

We can wait till you check the permissions, @bferrell. Best regards, Isi

5 More Replies
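
For anyone hitting the same wall, a minimal sketch of querying a SQL Warehouse from a Python app with the databricks-sql-connector package is shown below; the hostname, HTTP path, and token are placeholders you would supply through your own app configuration (for example, environment variables), and the query is only a smoke test, not the original poster's code.

import os
from databricks import sql  # pip install databricks-sql-connector

with sql.connect(
    server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],  # workspace hostname
    http_path=os.environ["DATABRICKS_HTTP_PATH"],              # warehouse "HTTP Path" from Connection details
    access_token=os.environ["DATABRICKS_TOKEN"],               # token/credential the deployed app is allowed to use
) as connection:
    with connection.cursor() as cursor:
        # If this works locally but fails in the deployed app, the cause is usually
        # credentials or warehouse permissions rather than the app code itself.
        cursor.execute("SELECT 1")
        print(cursor.fetchall())

If the deployed app runs under a service principal or app identity instead of a personal token, that identity also needs "Can use" permission on the SQL Warehouse and access to the underlying data.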
brld
by New Contributor II
  • 1399 Views
  • 1 reply
  • 0 kudos

Resolved! Universal/cross-tab filters in dashboard

With the recent addition of dashboard tabs in Databricks, I could not find a way to have filters apply to multiple tabs within the same dashboard - so far I have had to manually create and apply filters to each of my tabs individually.

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @brld! Currently, dashboards do not support applying filters automatically across multiple tabs. You could try using a global parameter; it will be available to all widgets that use the same dataset or query. However, the filter compon...

Akshay_Petkar
by Valued Contributor
  • 2157 Views
  • 1 reply
  • 0 kudos

AWS Databricks external tables are delta tables?

If I create an external table on AWS Databricks, will it be a Delta table? If not, is there a way to make it a Delta table, or is there no Delta capability for external tables?

Latest Reply
Vidhi_Khaitan
Databricks Employee
  • 0 kudos

Hi Akshay, I believe you can try this for your use case:

CREATE TABLE IF NOT EXISTS catalog.schema.my_external_table (
  id INT,
  name STRING,
  age INT
)
USING DELTA
LOCATION '<location>';

This will create a Delta table.

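
If you want to double-check the result, a small sketch (assuming a notebook with an active Spark session and the same illustrative table name as above) is to inspect the table's metadata:

# Verify the external table was created in Delta format.
detail = spark.sql("DESCRIBE DETAIL catalog.schema.my_external_table")
detail.select("format", "location").show(truncate=False)  # expect format = 'delta'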
ossinova
by Contributor II
  • 4710 Views
  • 3 replies
  • 0 kudos

Resolved! Cross-page filters - AI/BI Dashboards

When creating a dashboard with multiple pages connected to one dataset, it seems that a filter only takes effect on visual elements on the same page as the filter. Is there a way to filter all visual elements regardless of which page the filter is on? I h...

Latest Reply
Alex_Lichen
Databricks Employee
  • 0 kudos

Hi folks, We are currently working on global filters, which will allow you to set a filter value or parameter value across multiple pages. Keep an eye out for that feature, coming soon!

2 More Replies
DEShoaib
by New Contributor II
  • 6426 Views
  • 7 replies
  • 1 kudos

Fabric with Databricks

Do we get the same functionality when using Databricks with Fabric as Databricks provides with Azure?

Latest Reply
dks
New Contributor II
  • 1 kudos

We aim to implement Databricks Mirroring through the Fabric APIs for automation. However, the Mirroring API specifically states that it is not compatible with Databricks. Are there alternative APIs that could be used to achieve this functionality?

6 More Replies
Dave_Nithio
by Contributor II
  • 3577 Views
  • 1 reply
  • 0 kudos

Monitoring Databricks SQL Warehouse without Unity Catalog

I am looking to monitor my SQL Warehouse, especially the 'Running Clusters' metric that is available in the monitoring tab of the warehouse. This shows the start and shutdown times as well as the number of running clusters: The issue I have run into i...

Latest Reply
Isi
Honored Contributor III
  • 0 kudos

Hey @Dave_Nithio, To monitor the “Running Clusters” metric for your SQL Warehouse, you can use the Databricks Cluster Events API. This API retrieves a list of events related to cluster activity, such as start and shutdown times, and provides paginated...

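
As a rough illustration of that suggestion, here is a sketch that polls the Cluster Events endpoint with the requests library; the host, token, and cluster ID are placeholders, and the endpoint version and payload fields should be confirmed against the REST API reference for your workspace.

import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]  # PAT or service principal token
cluster_id = "<warehouse-cluster-id>"   # cluster backing the SQL Warehouse

events = []
payload = {"cluster_id": cluster_id, "order": "DESC", "limit": 50}
while True:
    resp = requests.post(
        f"{host}/api/2.0/clusters/events",
        headers={"Authorization": f"Bearer {token}"},
        json=payload,
    )
    resp.raise_for_status()
    body = resp.json()
    events.extend(body.get("events", []))
    if "next_page" not in body:
        break
    payload = body["next_page"]  # the API hands back the parameters for the next page

# Each event carries a timestamp and a type such as RUNNING or TERMINATED.
for event in events:
    print(event.get("timestamp"), event.get("type"))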
barchiel33
by New Contributor II
  • 4807 Views
  • 3 replies
  • 1 kudos

Databricks SQL Python - Result fetching takes extremely long time

Hello All! I have a Python script which uses the Databricks SQL connector for Python to pull a Databricks table into a pandas DataFrame, which is used to create a table in a Spotfire report. The table contains ~1.28 million rows, with 155 co...

Latest Reply
Isi
Honored Contributor III
  • 1 kudos

Hey @barchiel33, After reviewing your context further, I believe the most effective approach would be to set up an automated pipeline within Databricks that periodically extracts data based on the frequency you need (daily, weekly, hourly, etc.), cre...

2 More Replies
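
Before re-architecting, one knob worth checking is how the results are fetched: the Python connector can return Arrow data and convert it to pandas in one step, which is usually far faster than building millions of Row objects. A minimal sketch, with placeholder connection details and an illustrative table name:

import os
from databricks import sql  # databricks-sql-connector

with sql.connect(
    server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
    http_path=os.environ["DATABRICKS_HTTP_PATH"],
    access_token=os.environ["DATABRICKS_TOKEN"],
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT * FROM catalog.schema.wide_table")
        # fetchall_arrow() returns a pyarrow Table; converting it to pandas in bulk
        # avoids materializing ~1.28M rows one Python object at a time.
        df = cursor.fetchall_arrow().to_pandas()

print(df.shape)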
alex456897874
by New Contributor III
  • 2690 Views
  • 7 replies
  • 1 kudos

How to reuse a filter between pages in dashboards?

Hi! I have a multi-page dashboard based on a handful of tables sharing a common run_id field. I created a dropdown filter widget on the first page to filter datasets based on the table's run_id field. It worked fine for this page, but how to reuse it fo...

Latest Reply
Benjamin_DavidS
New Contributor II
  • 1 kudos

I have the same problem here. Filters are not applied when switching pages.

6 More Replies
tbmadhav
by New Contributor
  • 690 Views
  • 1 reply
  • 0 kudos

Convert a Delta table (flattened JSON format) to a nested Java object

I have a Delta table with columns in the format below. I am querying Databricks using the databricks-jdbc 2.7.1 driver. I want to convert the result set to a nested Java object. Order_actualPickupAddress_residential -> string, Order_pickupRequestDetails -> ar...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Here are some suggestions/ideas to consider. To map hundreds of nested fields from a Databricks Delta table to complex Java objects like the Order object described, consider the following approaches: 1. Using Libraries for Object Mapping: Utilize ...

pavel_cerny
by New Contributor II
  • 5266 Views
  • 2 replies
  • 1 kudos

Databricks JDBC driver fails with socket read timeout

Our application connects to a Databricks serverless SQL warehouse via the Databricks JDBC driver. It executes SQL select statements only. We see a small number of statements fail each day with the following error detail: java.sql.SQLException: [Databricks][JDB...

Latest Reply
kamal_ch
Databricks Employee
  • 1 kudos

Interesting issue; it looks like a cluster load issue. I recommend filing a support ticket with Databricks to investigate, if you haven't already.

1 More Replies
Akshay_Petkar
by Valued Contributor
  • 3592 Views
  • 1 reply
  • 0 kudos

Best Approach to Connect QuickBooks with Databricks

What’s the best and most seamless way to connect QuickBooks data to Databricks? Is there any recommended third-party tool or direct connection using JDBC/ODBC?

Latest Reply
kamal_ch
Databricks Employee
  • 0 kudos

Hi Akshay, You can use third-party tools that specialize in integrating with external platforms like Databricks. Tools such as CData, QODBC, or similar database connectors provide JDBC and ODBC drivers for QuickBooks. These drivers allow you to query Quick...

lucy-ji
by New Contributor III
  • 2302 Views
  • 5 replies
  • 1 kudos

Resolved! How to download widget result into CSV

In the Databricks tutorial (https://docs.databricks.com/gcp/en/sql/get-started/sample-dashboards), it says we can right-click the widget and choose "Download as CSV". However, in my Databricks dashboard, when I right-click the widgets, there is no ...

Latest Reply
lucy-ji
New Contributor III
  • 1 kudos

I found that the reason is that my admin didn't enable the "SQL result download" option. Now it is solved. Thanks all.

4 More Replies
teixeire
by New Contributor II
  • 1774 Views
  • 3 replies
  • 1 kudos

How to restrict external access to SQL Warehouse but allow workspace queries?

Hi everyone, I'm currently setting up access controls in our Databricks development workspace. The goal is to enable business users to explore data and build their SQL skills within the workspace itself (e.g., via SQL editor or notebooks), but prevent...

Warehousing & Analytics
Access Rights
Endpoint
sql
Latest Reply
Isi
Honored Contributor III
  • 1 kudos

Hi @teixeire, To prevent external tools like Power BI or DBeaver from connecting to your SQL Warehouse, one effective approach is to restrict personal access token (PAT) creation for users who should only query data inside the Databricks workspace. Th...

2 More Replies
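
To make that concrete, token usage can be managed through the workspace Permissions API (object type authorization/tokens). The sketch below replaces the default rule with an allow-list so only a named group can use personal access tokens; the group name, host, and admin token are placeholders, and the endpoint should be verified against the REST API reference before you rely on it.

import os
import requests

host = os.environ["DATABRICKS_HOST"]
admin_token = os.environ["DATABRICKS_ADMIN_TOKEN"]

# Replace the default "all users can use tokens" rule with an explicit allow-list.
resp = requests.put(
    f"{host}/api/2.0/permissions/authorization/tokens",
    headers={"Authorization": f"Bearer {admin_token}"},
    json={
        "access_control_list": [
            {"group_name": "token-allowed-users", "permission_level": "CAN_USE"}  # hypothetical group
        ]
    },
)
resp.raise_for_status()
print(resp.json())

Users outside the allow-listed group can then keep using the SQL editor and notebooks in the workspace but can no longer mint tokens for external clients.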
meghana_tulla
by New Contributor III
  • 1410 Views
  • 5 replies
  • 0 kudos

UCX V0.58.0 Dashboard in Databricks Showing 0 Tables Despite Existing Tables.

Hi, I am currently installing UCX version 0.58.0 in a non-Unity-enabled Databricks workspace. It installed successfully and the job completes without errors. However, the dashboard that gets generated shows "0 Total Tables" and other metrics as 0. Has...

Latest Reply
meghana_tulla
New Contributor III
  • 0 kudos

Hi, still facing the same issue. I am trying to automate the UCX installation using a bash script. When running the bash script, the job starts and runs successfully, with all jobs completing. Even though all jobs ran successfully, I am getting 0 tables and 0 va...

4 More Replies
Sweta
by New Contributor II
  • 630 Views
  • 1 reply
  • 0 kudos

Optimized way to write updates to Aurora PostgreSQL from Databricks/Spark

Hello All, we want to update our Postgres tables from our Spark Structured Streaming workflow on Databricks. We are using the foreachBatch utility to write to this sink. I want to understand an optimized way to do this at near-real-time latency, avoidi...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

If near real-time latency is critical, focus on optimizing parallel writes with batch updates (Option 1). If you prioritize transactional stability and can tolerate slightly higher latency due to staging, continue refining your temporary table with ...

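
For readers looking for a starting point on the parallel-write pattern (Option 1), here is a rough sketch using foreachBatch and Spark's JDBC writer; the host, database, table, secret scope, and checkpoint path are placeholders, and a true upsert would still need a staging table plus a merge step on the Postgres side.

# Write each micro-batch to Aurora PostgreSQL over JDBC with parallel connections.
def write_batch_to_postgres(batch_df, batch_id):
    (batch_df
        .repartition(8)  # one JDBC connection per partition
        .write
        .format("jdbc")
        .option("url", "jdbc:postgresql://<aurora-host>:5432/<database>")
        .option("dbtable", "public.target_table")  # or a staging table when doing upserts
        .option("user", dbutils.secrets.get("pg-scope", "pg_user"))          # hypothetical secret scope
        .option("password", dbutils.secrets.get("pg-scope", "pg_password"))
        .option("batchsize", 10000)  # larger JDBC batches cut round trips
        .mode("append")
        .save())

(streaming_df.writeStream                       # streaming_df: your existing streaming DataFrame
    .foreachBatch(write_batch_to_postgres)
    .option("checkpointLocation", "/Volumes/main/default/checkpoints/postgres_sink")  # placeholder path
    .start())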