Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.
Data + AI Summit 2024 - Data Warehousing, Analytics, and BI

Forum Posts

MadelynM
by Databricks Employee
  • 943 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap to help you use intelligent data warehousing to improve performance and increase your organization's productivity with analytics, dashboards, and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
scrimpton
by New Contributor II
  • 1628 Views
  • 1 replies
  • 0 kudos

Resolved! Delta Sharing with Power BI

Using the Delta Sharing connector with Power BI, does it only work in import mode, with currently no support for DirectQuery?

Warehousing & Analytics
DELTA SHARING
Power BI
Latest Reply
karthik_p
Esteemed Contributor
  • 0 kudos

@scrimpton Currently it only supports import mode: https://learn.microsoft.com/en-us/power-query/connectors/delta-sharing

Datbth
by New Contributor
  • 605 Views
  • 0 replies
  • 0 kudos

Cancel SQL statement using ODBC driver

Hi, I'm implementing a Databricks connector using the ODBC driver and am currently working on the functionality to cancel an ongoing SQL statement. However, I can't seem to find any ODBC function or SQL function to do so. The only other alternative I see i...

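For reference, the ODBC API does expose a cancel call: SQLCancel() (and SQLCancelHandle() in ODBC 3.8) can be invoked on the statement handle from another thread. A minimal Python sketch of the same pattern, assuming a driver such as pyodbc whose `Cursor.cancel()` wraps SQLCancel; the DSN name in the comments is a placeholder:

```python
import threading

def cancel_after(cursor, seconds):
    """Cancel a running statement on `cursor` after `seconds`, from a helper
    thread. Works with any cursor exposing .cancel() (e.g. pyodbc's, which
    maps to the ODBC SQLCancel() call)."""
    timer = threading.Timer(seconds, cursor.cancel)
    timer.daemon = True
    timer.start()
    return timer

# Usage sketch (placeholder DSN):
# import pyodbc
# conn = pyodbc.connect("DSN=Databricks", autocommit=True)
# cur = conn.cursor()
# cancel_after(cur, 5.0)     # give up if the query runs longer than 5 s
# cur.execute("SELECT ...")  # raises an error once the statement is cancelled
```

The key point is that SQLCancel is called from a different thread than the one blocked in SQLExecDirect/SQLFetch.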
ckwan48
by New Contributor III
  • 6281 Views
  • 5 replies
  • 10 kudos

Trying to connect to DBeaver from Databricks and getting error message

I am trying to connect to DBeaver from Databricks and getting this error message: [Databricks][DatabricksJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: javax.net.ssl.SSLHandshakeException: PKIX path building faile...

Latest Reply
Hardy
New Contributor III
  • 10 kudos

I have the same issue after upgrading cluster to DBR 12.2. Working fine with DBR 10.4

4 More Replies
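A PKIX path-building failure usually means the JVM running the JDBC client cannot build a trust chain for the TLS certificate the warehouse endpoint presents (common behind TLS-inspecting proxies). One workaround sketch: save the certificate the server actually serves, then import it into the client JVM's truststore with keytool. The host name and file path below are placeholders, and `fetch` is injectable purely so the helper is testable:

```python
import ssl

def save_server_cert(host, port=443, path="server.pem",
                     fetch=ssl.get_server_certificate):
    """Save the PEM certificate presented by `host` so it can be imported
    into the JDBC client's truststore, e.g. (Java 9+):
        keytool -importcert -alias dbx -file server.pem -cacerts
    `fetch` defaults to the stdlib call and takes a (host, port) tuple."""
    pem = fetch((host, port))
    with open(path, "w") as f:
        f.write(pem)
    return path

# save_server_cert("your-workspace.cloud.databricks.com")  # placeholder host
```

After importing, restart DBeaver so the JVM re-reads the truststore.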
Yahya24
by New Contributor III
  • 2912 Views
  • 2 replies
  • 1 kudos

Resolved! API Query

Hello, I created a SQL warehouse (cluster size = 2X-Small) and wanted to use it to execute a query using the SQL query API:
- url: https://databricks-host/api/2.0/preview/sql/statements
- params = {'warehouse_id': 'warehouse_id', 'statement': 'SELECT ...

Latest Reply
karthik_p
Esteemed Contributor
  • 1 kudos

@Yahya24 Can you please remove "preview" from the path? These endpoints are no longer in preview: "/api/2.0/sql/statements/". You should see a JSON response; please check the drop-down menu and change it to JSON. Sometimes it is set to text, but the usual respo...

1 More Replies
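To illustrate the corrected path, here is a small Python sketch that builds a request for the SQL Statement Execution API without the `preview` segment. The host, warehouse ID, and token are placeholders, and `wait_timeout` is an optional parameter:

```python
def build_statement_request(host, warehouse_id, sql):
    """Build the URL and JSON body for the SQL Statement Execution API.
    Note the path no longer contains a 'preview' segment."""
    url = f"https://{host}/api/2.0/sql/statements/"
    payload = {
        "warehouse_id": warehouse_id,
        "statement": sql,
        "wait_timeout": "30s",  # optional: block up to 30 s for the result
    }
    return url, payload

# Usage sketch with requests (placeholder host and token):
# import requests
# url, payload = build_statement_request(
#     "dbc-xxxx.cloud.databricks.com", "abc123", "SELECT 1")
# r = requests.post(url, json=payload,
#                   headers={"Authorization": f"Bearer {TOKEN}"})
# print(r.json()["status"]["state"])
```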
gmiguel
by Contributor
  • 3907 Views
  • 2 replies
  • 2 kudos

Resolved! Does "Merge Into" skip files when reading target table to find files to be touched?

I've been doing some testing with Partitions vs Z-Ordering to optimize the merge process. As the documentation says, tables smaller than 1TB should not be partitioned and can benefit from the Z-Ordering process to optimize the reading process. Analyzin...

1 More Replies
Mswedorske
by New Contributor II
  • 1781 Views
  • 1 replies
  • 2 kudos

Resolved! Historical Reporting

How do you handle reporting monthly trends within a data lakehouse? Can this be done with time travel to get the table state at the end of each month, or is it better practice to build a data warehouse with SCD types? We are new to Databricks and lak...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 2 kudos

@Mswedorske IMO it would be better to use SCD. When you run VACUUM on a table, it removes the data files that are needed for Time Travel, so relying on Time Travel is not the best choice.

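To illustrate why SCD Type 2 preserves month-end state where time travel (after VACUUM) cannot, here is a minimal pure-Python sketch of the pattern: instead of overwriting a dimension row, the current version is closed out and a new version appended. The column names (`valid_from`, `valid_to`) are illustrative, not a Databricks API:

```python
import datetime as dt

def scd2_upsert(dim, update, key, effective_date):
    """Apply one change to a Type 2 dimension held as a list of dicts.
    Rows with valid_to=None are current; history is never overwritten."""
    for row in dim:
        if row[key] == update[key] and row["valid_to"] is None:
            if all(row.get(k) == v for k, v in update.items()):
                return dim  # no attribute change, nothing to do
            row["valid_to"] = effective_date  # close the current version
            break
    dim.append(dict(update, valid_from=effective_date, valid_to=None))
    return dim

# Example: a customer moves city; both versions remain queryable,
# so month-end reports can join on valid_from/valid_to ranges.
dim = [{"customer_id": 1, "city": "Oslo",
        "valid_from": dt.date(2024, 1, 1), "valid_to": None}]
scd2_upsert(dim, {"customer_id": 1, "city": "Bergen"},
            "customer_id", dt.date(2024, 2, 1))
```

In a lakehouse the same logic is typically expressed as a MERGE into the dimension table.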
BamBam
by New Contributor II
  • 1971 Views
  • 1 replies
  • 0 kudos

Where are driver logs for SQL Pro Warehouse?

In an All-Purpose Cluster, it is pretty easy to get at the Driver logs.  Where do I find the Driver Logs for a SQL Pro Warehouse?  The reason I ask is because sometimes in a SQL Editor we get generic error messages like "Task failed while writing row...

Warehousing & Analytics
SQLProWarehouse
Kaz
by New Contributor II
  • 3098 Views
  • 4 replies
  • 1 kudos

Automatically importing packages in notebooks

Within our team, there are certain (custom) python packages we always use and import in the same way. When starting a new notebook or analysis, we have to import these packages every time. Is it possible to automatically make these imports available ...

Latest Reply
Tharun-Kumar
Databricks Employee
  • 1 kudos

@Kaz You can install these libraries using the Libraries section in the Compute configuration. All of the libraries listed there will be installed whenever the cluster is spun up.

3 More Replies
Kaz
by New Contributor II
  • 5655 Views
  • 0 replies
  • 0 kudos

Show full logs on job log

Is it possible to show the full logs of a Databricks job? Currently, the logs are skipped with: *** WARNING: max output size exceeded, skipping output. *** However, I don't believe our log files are more than 20 MB. I know you can press the logs button...

SimonMcCor
by New Contributor
  • 2397 Views
  • 1 replies
  • 1 kudos

Calculated Field in Dashboards

Is there a way to create a calculated field in a dashboard from the data that has been put into it? I have an aggregated dataset that goes into a dashboard, but using an average in the calculation will only work if I display the average by the grouped...

Erik
by Valued Contributor III
  • 8308 Views
  • 0 replies
  • 0 kudos

Hot path event processing and serving in databricks

We have a setup where we process sensor data in Databricks using PySpark Structured Streaming from Kafka streams, and continuously write these to Delta tables. These Delta tables are served through a SQL warehouse endpoint to the users. We also store ...

colinsorensen
by New Contributor III
  • 2297 Views
  • 0 replies
  • 0 kudos

Unhandled error while executing ['DatabricksSQLCursorWrapper' object has no attribute 'fetchmany'

Getting this error in dbt when trying to run a query. It does not happen in the actual SQL warehouse in Databricks. Is this a bug? I can only find source code when I search for 'DatabricksSQLCursorWrapper', but no documentation or other information.

hehuan-yu-zen
by New Contributor II
  • 1620 Views
  • 1 replies
  • 0 kudos

Customise the dates shown in the calendar selection in SQL editor/dashboard

Does anybody know whether we could customise the dates shown in the calendar selection in the SQL editor/dashboard? My query has a time frame in a particular period; however, when I use the DateRange parameter in the SQL editor, it could allow users to choose th...

