Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 3117 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
ehsan1
by New Contributor II
  • 1516 Views
  • 1 replies
  • 1 kudos

Having an issue querying a Databricks endpoint using SQL Workbench/J

It was working fine initially.   Message: [Simba][SparkJDBCDriver](500618) Error occured while deserializing arrow data: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available [SQL State=HY000, DB Errorcode=500618]

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Ehsan Ullah​: The error message you received indicates that there is an issue with deserializing Arrow data in the Spark JDBC driver. This error is caused by the fact that the sun.misc.Unsafe or java.nio.DirectByteBuffer classes are not available. To ...

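This error is commonly reported when the JDBC client runs on Java 9 or later, where module restrictions hide `sun.misc.Unsafe` and the `DirectByteBuffer` constructor from the driver's Arrow deserializer. Two commonly cited workarounds, sketched below with placeholder values (the launch command and URL are illustrative, not taken from this thread):

```
# Workaround 1: re-open java.nio to the driver by launching the client
# JVM (here, a hypothetical SQL Workbench/J launch) with an --add-opens flag:
java --add-opens=java.base/java.nio=ALL-UNNAMED -jar sqlworkbench.jar

# Workaround 2: disable Arrow deserialization via the driver's EnableArrow
# property in the JDBC URL (placeholders in <angle brackets>):
jdbc:spark://<server-hostname>:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>;EnableArrow=0
```

Disabling Arrow trades some result-fetch performance for compatibility, so the JVM flag is usually preferable wherever the launch script can be edited.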
Priyag1
by Honored Contributor II
  • 5516 Views
  • 1 replies
  • 9 kudos

Refreshing SQL Dashboard

Refreshing SQL Dashboard: You can schedule the dashboard to automatically refresh at an interval. At the top of the page, click Schedule. If the dashboard already has a schedule, you see Scheduled instead of Schedule. Select an interval, such as Every 1 h...

Latest Reply
samhita
New Contributor III
  • 9 kudos

Useful

RichardSCarchit
by New Contributor III
  • 15315 Views
  • 5 replies
  • 2 kudos

How to query object ID in Databricks SQL warehouse using only SQL?

I can see on the Databricks SQL warehouse Data tab that clusters, catalogs and schemas have a unique ID. User-created tables, views and functions must have a unique ID too, but it is not exposed to the user as far as I can tell. I need to retrieve the ...

Latest Reply
Priyag1
Honored Contributor II
  • 2 kudos

Please refer to the documentation first.

4 More Replies
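For Delta tables there is a SQL-only way to surface an internal identifier: `DESCRIBE DETAIL`, whose result set includes an `id` column holding the table's unique GUID. A minimal sketch, assuming a hypothetical table name:

```sql
-- Hypothetical three-level name; the result includes an `id` column
-- with the table's GUID, plus format, location, and other metadata.
DESCRIBE DETAIL main.default.my_table;
```

This covers Delta tables only; whether views and functions expose a comparable ID through pure SQL is not established in this thread.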
MetaRossiVinli
by Contributor
  • 3619 Views
  • 1 replies
  • 1 kudos

Resolved! In Python, Streaming read by DLT from Hive Table

I am pulling data from Google BigQuery and writing it to a bronze table on an interval. I do this in a separate continuous job because DLT did not like the BigQuery connector calling collect on a dataframe inside of DLT.   In Python, I would like to ...

Latest Reply
MetaRossiVinli
Contributor
  • 1 kudos

The below code is a solution. I was missing that I could read from a table with `spark.readStream.format("delta").table("...")`. Simple. Just missed it. This is different from `dlt.read_stream()`, which appears in the examples a lot. This is referenced...

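The distinction the reply draws can be sketched as a minimal Delta Live Tables definition. Table names are hypothetical, and this only runs inside a Databricks DLT pipeline, where `spark` and the `dlt` module are provided:

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(name="silver_events")
def silver_events():
    # Stream from a plain Delta/Hive metastore table that DLT does not
    # manage -- dlt.read_stream() only reads tables defined in the pipeline,
    # while spark.readStream can read any Delta table.
    return (
        spark.readStream.format("delta")
        .table("bronze.events")
        .withColumn("ingested_at", F.current_timestamp())
    )
```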
Salty
by New Contributor II
  • 4471 Views
  • 4 replies
  • 2 kudos

Resolved! DBSQL subscriptions method returning `410: Gone`

We've been using the DBSQL API to perform CRUD on queries and alerts. Part of that process added a Slack channel as an alert destination using the /subscriptions element on an alert post, as below. As of today I am getting a 410 'Gone' error from the API ...

Latest Reply
Salty
New Contributor II
  • 2 kudos

Hi @Suteja Kanuri​, @Vidula Khanna​, thanks for getting back with a solution. The suggested solution looks fine, but for a number of reasons I went with another option to use the Jobs API. This allowed me to preserve more of the automation I had alrea...

3 More Replies
Cricket_Clues
by New Contributor
  • 967 Views
  • 0 replies
  • 0 kudos

IPL prediction is a very challenging task. Due to the nature of the game, it is challenging to correctly predict the winner because even one player...

IPL prediction is a very challenging task. Due to the nature of the game, it is challenging to correctly predict the winner because even one player's performance can significantly affect the outcome. It is also challenging to forecast which club woul...

MarSier
by New Contributor
  • 3274 Views
  • 2 replies
  • 0 kudos

Resolved! PrivateLink AWS - Databricks, "Cluster terminated. Reason: Security Daemon Registration Exception"

Hi FerArribas, I am struggling with the PrivateLink connection between my Databricks account and my AWS account. I have seen that you had a similar problem. I can create a workspace, but when I try to create a cluster I get an error: "Cluster terminated. Reason: ...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Marcin Sieradzan​, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answ...

1 More Replies
thisisadarshsin
by New Contributor II
  • 4874 Views
  • 2 replies
  • 1 kudos

Loading a Power BI report in Databricks

Hi, I am trying to load a Power BI report in Databricks, but I am getting an empty result when printing the report. I have already installed powerbiclient (pip install powerbiclient), and device_auth is also successful, but report = Report(group_id=group_id, report_id=report_id, ...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Adarsh Singh​, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...

1 More Replies
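For reference, a minimal sketch of the powerbiclient flow the question describes. The IDs are placeholders, and this requires an interactive notebook: the Report object is a Jupyter ipywidget, so environments without ipywidget support show nothing, which can look like an "empty" report:

```python
from powerbiclient import Report
from powerbiclient.authentication import DeviceCodeLoginAuthentication

# Interactive device-code login: prints a code to enter in the browser.
device_auth = DeviceCodeLoginAuthentication()

report = Report(group_id="<workspace-id>",  # placeholder workspace (group) ID
                report_id="<report-id>",    # placeholder report ID
                auth=device_auth)
report  # an ipywidget: renders only where Jupyter widgets are supported
```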
marko
by New Contributor II
  • 22513 Views
  • 5 replies
  • 2 kudos

SQL Warehouse high number of concurrent queries

We are going to be a Databricks customer and did some PoC tests. One test dataset is a partitioned table (15 columns) of roughly 250M rows; each partition is ~50K-150K rows. Occasionally we have hundreds (up to one thousand) concurrent u...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Marian Kovac​, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thank...

4 More Replies
kori73
by New Contributor
  • 6716 Views
  • 1 replies
  • 0 kudos

Spark UI SQL/Dataframe tab missing queries

Hi, recently I have been having some problems viewing the query plans in the Spark UI SQL/DataFrame tab. I would expect to see large query plans in the SQL tab, where we can observe details of the query such as the rows read/written/shuffled. Howeve...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Koray Beyaz​: This issue may be related to a change in the default behavior of the Spark UI in recent versions of Databricks Runtime. In earlier versions, the Spark UI would display the full query plan for SQL and DataFrame operations in the SQL/Dat...

ManuShell
by New Contributor III
  • 10912 Views
  • 10 replies
  • 7 kudos

Databricks SQL: ODBC URL to connect to Databricks SQL tables

Hello everyone, we are trying to connect to a Databricks SQL warehouse using an ODBC URL but are not able to. We can see only a JDBC URL in the connection details, which works fine. Was anyone able to connect using an ODBC URL? Can someone please help?

Latest Reply
Sumit_Kumar
New Contributor III
  • 7 kudos

Hi, can you please try to connect once after disabling the VPN on your local system (if it's enabled)? Thanks.

9 More Replies
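Databricks does not publish a single "ODBC URL"; instead, the ODBC driver is configured from the same hostname and HTTP path shown in the JDBC connection details. A sketch of a DSN-less connection string, assuming the Simba Spark ODBC driver is installed (property names are from the driver's documentation; all values are placeholders):

```
Driver=Simba Spark ODBC Driver;Host=<server-hostname>;Port=443;SSL=1;
ThriftTransport=2;HTTPPath=<warehouse-http-path>;AuthMech=3;
UID=token;PWD=<personal-access-token>
```

Here `ThriftTransport=2` selects HTTP transport and `AuthMech=3` selects username/password authentication, with the literal username `token` and a personal access token as the password.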