Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 3008 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap: use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards, and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
Ajbi
by New Contributor II
  • 2434 Views
  • 0 replies
  • 0 kudos

Power BI refresh from Databricks workflow

We recently enabled the preview feature to refresh Power BI datasets from a Databricks workflow, but when we run the job, data changes aren't being applied despite a completed status in refresh history. Could anyone clarify whether this is intended pre...

Warehousing & Analytics
Databricks Power BI Task
power bi refresh
tbmadhav
by New Contributor
  • 653 Views
  • 1 reply
  • 0 kudos

Convert a delta table (flattened json format) to a nested java object

I have a Delta table with columns in the format below. I am querying Databricks using the databricks-jdbc 2.7.1 driver, and I want to convert the result set to a nested Java object. Order_actualPickupAddress_residential -> string, Order_pickupRequestDetails -> ar...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Here are some suggestions/ideas to consider:   To map hundreds of nested fields from a Databricks Delta table to complex Java objects like the Order object described, consider the following approaches:   1. Using Libraries for Object Mapping Utilize ...
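
As a concrete illustration of the first suggestion (object mapping via a library), here is a minimal Java sketch, not the poster's actual model: it un-flattens the Order_* column names from the JDBC result set into a nested map and lets Jackson bind that map to a POJO tree. The class names, field names, table name, and connection details are hypothetical placeholders.

import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.sql.*;
import java.util.HashMap;
import java.util.Map;

// Hypothetical target classes standing in for the real nested Order model.
class PickupAddress { public String residential; }
class Order { public PickupAddress actualPickupAddress; }

public class FlattenedRowMapper {
    @SuppressWarnings("unchecked")
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper()
            .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

        // Placeholder Databricks JDBC connection details.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:databricks://<host>:443/default;transportMode=http;ssl=1;httpPath=<http-path>",
                 "token", System.getenv("DATABRICKS_TOKEN"));
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM orders_flat LIMIT 10")) {

            ResultSetMetaData md = rs.getMetaData();
            while (rs.next()) {
                // Turn Order_actualPickupAddress_residential into
                // {actualPickupAddress: {residential: <value>}} ...
                Map<String, Object> nested = new HashMap<>();
                for (int i = 1; i <= md.getColumnCount(); i++) {
                    String[] path = md.getColumnLabel(i).split("_");
                    Map<String, Object> cur = nested;
                    for (int p = 1; p < path.length - 1; p++) {   // skip the leading "Order" segment
                        cur = (Map<String, Object>) cur.computeIfAbsent(path[p], k -> new HashMap<String, Object>());
                    }
                    cur.put(path[path.length - 1], rs.getObject(i));
                }
                // ... then let Jackson bind the map to the nested object.
                Order order = mapper.convertValue(nested, Order.class);
                System.out.println(mapper.writeValueAsString(order));
            }
        }
    }
}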

pavel_cerny
by New Contributor II
  • 4921 Views
  • 2 replies
  • 1 kudos

Databricks JDBC driver fails with socket read timeout

An application connects to a Databricks serverless SQL warehouse via the Databricks JDBC driver. It executes SQL select statements only. We see a small number of statements fail each day with the following error detail: java.sql.SQLException: [Databricks][JDB...

Latest Reply
kamal_ch
Databricks Employee
  • 1 kudos

Interesting issue; it looks like a cluster load problem. We recommend filing a support ticket with Databricks to investigate, if you haven't already.

1 More Replies
Akshay_Petkar
by Valued Contributor
  • 3267 Views
  • 1 reply
  • 0 kudos

Best Approach to Connect QuickBooks with Databricks

What’s the best and most seamless way to connect QuickBooks data to Databricks? Is there a recommended third-party tool or a direct connection using JDBC/ODBC?

Latest Reply
kamal_ch
Databricks Employee
  • 0 kudos

Hi Akshay,   You can use third-party tools that specialize in integrating with external platforms like Databricks. Tools such as CData, QODBC, or similar database connectors provide JDBC and ODBC drivers for QB. These drivers allow you to query Quick...
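
For illustration, here is a rough Java sketch of pulling QuickBooks data into Databricks through such a third-party JDBC driver, using Spark's JDBC reader and landing the result in a Delta table. The driver class name, JDBC URL format, and credentials below are assumptions for illustration only; the exact values depend on the vendor (CData, QODBC, etc.) and should be taken from their documentation.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class QuickBooksIngest {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().getOrCreate();

        Dataset<Row> invoices = spark.read()
            .format("jdbc")
            // Hypothetical vendor URL and driver class; real values vary by product and auth mode.
            .option("url", "jdbc:quickbooksonline:OAuthClientId=<id>;OAuthClientSecret=<secret>;")
            .option("driver", "cdata.jdbc.quickbooksonline.QuickBooksOnlineDriver")
            .option("dbtable", "Invoices")
            .load();

        // Land the data in a Delta table so dashboards and SQL users can query it.
        invoices.write().mode("overwrite").saveAsTable("finance.quickbooks_invoices");
    }
}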

lucy-ji
by New Contributor III
  • 2219 Views
  • 5 replies
  • 1 kudos

Resolved! How to download widget result into CSV

In the Databricks tutorial (https://docs.databricks.com/gcp/en/sql/get-started/sample-dashboards), it says we can right-click the widget and choose "Download as CSV". However, in my Databricks dashboard, when I right-click the widgets, there is no ...

Latest Reply
lucy-ji
New Contributor III
  • 1 kudos

I found the reason: my admin hadn't enabled the "SQL result download" option. Now it is solved. Thanks all.

4 More Replies
Kaz
by New Contributor II
  • 9814 Views
  • 3 replies
  • 1 kudos

Show full logs on job log

Is it possible to show the full logs of a Databricks job? Currently, the logs are cut off with: *** WARNING: max output size exceeded, skipping output. *** However, I don't believe our log files are more than 20 MB. I know you can press the logs button...

Latest Reply
Isi
Honored Contributor III
  • 1 kudos

Hey @Kaz, unfortunately the output truncation limit in the Databricks job UI cannot be changed. Once that limit is exceeded, the rest of the logs are skipped, and the full logs become accessible only through the “Logs” button, which, as you mentione...

2 More Replies
teixeire
by New Contributor II
  • 1614 Views
  • 3 replies
  • 1 kudos

How to restrict external access to SQL Warehouse but allow workspace queries?

Hi everyone, I'm currently setting up access controls in our Databricks development workspace. The goal is to enable business users to explore data and build their SQL skills within the workspace itself (e.g., via the SQL editor or notebooks), but prevent...

Warehousing & Analytics
Access Rights
Endpoint
sql
Latest Reply
Isi
Honored Contributor III
  • 1 kudos

Hi @teixeire, to prevent external tools like Power BI or DBeaver from connecting to your SQL Warehouse, one effective approach is to restrict personal access token (PAT) creation for users who should only query data inside the Databricks workspace. Th...
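
If you take the token-restriction route, one way to apply it is through the workspace's token permissions REST API, which controls which principals may create and use personal access tokens. The sketch below (Java 11+, standard HTTP client) is an assumption-laden illustration, not the complete answer: the group name is a placeholder, and the endpoint behavior should be verified against the Permissions API documentation for your workspace.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RestrictTokenCreation {
    public static void main(String[] args) throws Exception {
        String host = "https://<workspace-host>";                 // placeholder workspace URL
        String adminToken = System.getenv("DATABRICKS_TOKEN");    // admin credential

        // Replace the token ACL so only the named group can create/use PATs;
        // everyone else loses the ability to mint tokens for external tools.
        String body = "{\"access_control_list\": ["
                    + "{\"group_name\": \"pat-allowed-users\", \"permission_level\": \"CAN_USE\"}"
                    + "]}";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(host + "/api/2.0/permissions/authorization/tokens"))
            .header("Authorization", "Bearer " + adminToken)
            .header("Content-Type", "application/json")
            .PUT(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}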

2 More Replies
meghana_tulla
by New Contributor III
  • 1358 Views
  • 5 replies
  • 0 kudos

UCX V0.58.0 Dashboard in Databricks Showing 0 Tables Despite Existing Tables.

Hi, I am currently installing UCX version 0.58.0 in a non-Unity-enabled Databricks workspace. It installed successfully and the job completes without errors. However, the dashboard that gets generated shows "0 Total Tables" and other metrics as 0. Has...

Latest Reply
meghana_tulla
New Contributor III
  • 0 kudos

Hi, I'm still facing the same issue. I am trying to automate the UCX installation using a bash script. When I run the script, the job starts and runs successfully, as do all of the run jobs. Even though all jobs ran successfully, I am getting 0 tables and 0 va...

4 More Replies
Sweta
by New Contributor II
  • 591 Views
  • 1 reply
  • 0 kudos

Optimized option to write updates to Aurora PostgresDB from Databricks/spark

Hello all, we want to update our Postgres tables from our Spark Structured Streaming workflow on Databricks. We are using the foreachBatch utility to write to this sink. I want to understand an optimized way to do this at near-real-time latency, avoidi...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

If near-real-time latency is critical, focus on optimizing parallel writes with batch updates (Option 1). If you prioritize transactional stability and can tolerate slightly higher latency due to staging, continue refining your temporary table with ...
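
For context on Option 1 (parallel writes with batch updates), here is a minimal Java sketch of a foreachBatch sink that opens one connection per partition and pushes JDBC batches, letting PostgreSQL perform the merge with INSERT ... ON CONFLICT. Source and target table names, columns, and connection details are placeholders, not the poster's actual pipeline.

import org.apache.spark.api.java.function.ForeachPartitionFunction;
import org.apache.spark.api.java.function.VoidFunction2;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class PostgresUpsertStream {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder().getOrCreate();

        // Placeholder streaming source; swap in the actual table or stream.
        Dataset<Row> stream = spark.readStream().table("main.sales.orders_cdc");

        stream.writeStream()
            .option("checkpointLocation", "/Volumes/main/sales/checkpoints/orders_sink")
            .foreachBatch((VoidFunction2<Dataset<Row>, Long>) (batchDF, batchId) ->
                batchDF.foreachPartition((ForeachPartitionFunction<Row>) rows -> {
                    // One connection per partition; rows go out as JDBC batches and
                    // PostgreSQL handles the upsert via ON CONFLICT.
                    try (Connection conn = DriverManager.getConnection(
                             "jdbc:postgresql://<aurora-endpoint>:5432/<db>",
                             System.getenv("PG_USER"), System.getenv("PG_PASSWORD"));
                         PreparedStatement ps = conn.prepareStatement(
                             "INSERT INTO orders (id, status) VALUES (?, ?) "
                           + "ON CONFLICT (id) DO UPDATE SET status = EXCLUDED.status")) {
                        int pending = 0;
                        while (rows.hasNext()) {
                            Row r = rows.next();
                            ps.setLong(1, r.getLong(r.fieldIndex("id")));
                            ps.setString(2, r.getString(r.fieldIndex("status")));
                            ps.addBatch();
                            if (++pending % 1000 == 0) ps.executeBatch();   // flush in chunks
                        }
                        ps.executeBatch();
                    }
                })
            )
            .start()
            .awaitTermination();
    }
}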

AP01
by New Contributor
  • 1553 Views
  • 1 reply
  • 0 kudos

Databricks JDBC Error: Job Aborted Due to Stage Failure (Executor OOM - Error Code 52)

java.sql.SQLException: [Databricks][JDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: null, Query: SELECT `ma***, Error message from Server: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.s...

Warehousing & Analytics
Databricks JDBC SparkSQL OOM HiveThriftServer Error500051
Databricks SQL
JDBC Driver
SparkSQL
sql
Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

The executor does not seem to have enough memory to process the assigned tasks, resulting in an OOM error.

Kyle2
by New Contributor II
  • 2387 Views
  • 3 replies
  • 1 kudos

Databricks JDBC driver fails with socket read timeout

We work with an application that connects to our Databricks serverless SQL warehouse via the Databricks JDBC driver. It runs a few thousand SQL select statements every day, and a small percentage of them fail with the following error details: java.sql...

Latest Reply
Rafael-Sousa
Contributor II
  • 1 kudos

Did you try increasing the socket timeout?
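
For reference, the read timeout is usually raised through a connection property on the JDBC URL. A minimal sketch follows; SocketTimeout (in seconds) is the property name commonly used by the Simba-based Databricks JDBC driver, but treat that as an assumption and confirm it against your driver version's documentation. Host, warehouse ID, and token are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;

public class WarehouseConnect {
    public static void main(String[] args) throws Exception {
        // Placeholder workspace values; SocketTimeout raises the read timeout beyond the default.
        String url = "jdbc:databricks://<workspace-host>:443/default"
                   + ";transportMode=http;ssl=1"
                   + ";httpPath=/sql/1.0/warehouses/<warehouse-id>"
                   + ";SocketTimeout=600";

        try (Connection conn = DriverManager.getConnection(
                url, "token", System.getenv("DATABRICKS_TOKEN"))) {
            System.out.println("Connected, closed=" + conn.isClosed());
        }
    }
}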

2 More Replies
meghana_tulla
by New Contributor III
  • 564 Views
  • 1 reply
  • 0 kudos

Issue with UCX Dashboard Display on Databricks v0.5.7 Installation

Hi, I am trying to install UCX for a Databricks workspace using the Databricks CLI. When I attempt to install a specific older version instead of the latest, I get a dashboard creation error. This seems to be due to the April 7 update, which removed the...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @meghana_tulla! Older versions of UCX may fail to install because they try to create legacy dashboards, which are no longer supported following the April 7 update. To work around this, make sure your Databricks environment is aligned with the l...

ahid
by New Contributor
  • 1128 Views
  • 2 replies
  • 0 kudos

Databricks Connector for Looker Studio – No Aggregation Pushdown + 1M Row Limit

Hi Databricks Community, I'm trying to understand which team is responsible for maintaining the Databricks Connector for Looker Studio. We’re currently facing a major performance bottleneck with how this connector operates. Specifically: the connector ...

Latest Reply
AndreyMirskiy
Databricks Employee
  • 0 kudos

Thank you for the feedback! Unfortunately, there is a limitation in the Looker Studio Community Connector API: the getData method does not specify aggregation expectations for the data source. Therefore, a connector is expected to retrieve non-aggregated resu...

1 More Replies
iamgoce
by New Contributor III
  • 2199 Views
  • 4 replies
  • 2 kudos

Resolved! Using Parameters in EXECUTE IMMEDIATE on Databricks SQL 2025.15 not working

Hi everyone, we are having an issue using parameters with EXECUTE IMMEDIATE statements when running on serverless SQL with DBSQL v2025.15 (currently in the preview channel): declare or replace var query = "SELECT :PARAMETER_1"; EXECUTE IMMEDIATE query;...

Latest Reply
iamgoce
New Contributor III
  • 2 kudos

@Louis_Frolio thanks for the detailed explanation, that makes more sense.

3 More Replies
Vasu_Kumar_T
by New Contributor II
  • 570 Views
  • 1 reply
  • 0 kudos

BladeBridge Products Availability

Hello all, I wanted to touch base regarding the training and tooling for BladeBridge Analyzer that we currently have access to. It has been quite useful in understanding and showcasing its capabilities to our customers. However, as we move forward, we a...

Latest Reply
Renu_
Valued Contributor II
  • 0 kudos

Hi @Vasu_Kumar_T, did you try the course mentioned on this page: https://www.databricks.com/blog/welcoming-bladebridge-databricks-accelerating-data-warehouse-migrations-lakehouse. If you are a Databricks partner, you can access BladeBridge migration tr...
