Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 3127 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
Ramakrishnan83
by New Contributor III
  • 2491 Views
  • 1 reply
  • 1 kudos

Intermittent SQL Failure on Databricks SQL Warehouse

Team, I set up a SQL Warehouse cluster to support requests from mobile devices through a REST API. I read through the documentation on the concurrent query limit, which is 10. But in my scenario I had 5 small clusters, and the query monitoring indicated the...

Latest Reply
198270
New Contributor II
  • 1 kudos

We have a similar problem: our self-service BI tool Looker uses a SQL warehouse, and queries that usually run in a few seconds randomly fail with this message (logged in Looker's history explore): "Java::JavaSql::SQLException: [Databricks][JDBCDriver](...

noimeta
by Contributor III
  • 13749 Views
  • 1 reply
  • 0 kudos

Resolved! Parameter section missing in AI/BI Dashboard

Hi, I'm trying to individually parameterize visualization widgets by following this tutorial: https://docs.databricks.com/en/dashboards/parameters.html#static-widget-parameters. However, it seems the Parameter section is missing. Is anyone facing the s...

Latest Reply
noimeta
Contributor III
  • 0 kudos

I found a solution: I have to toggle the show filters button, and then the Parameter section shows up.

Artem_Y
by Databricks Employee
  • 1093 Views
  • 0 replies
  • 2 kudos

How to make a sparkline in Databricks dashboards and visualizations

In this post, we'll examine one approach to creating a sparkline in a Databricks Dashboard table. Approach: As of writing this post, there is no built-in method of creating sparklines in a table, so we need to explore some workarounds. All workarounds...

Warehousing & Analytics
bi
dashboard
Visualization
prasadvaze
by Valued Contributor II
  • 13506 Views
  • 11 replies
  • 10 kudos

Azure Synapse versus Databricks SQL endpoint performance comparison

Has anyone done this and can share details? I have a sample SQL query which ran on a large SQL endpoint in 8 min and on a Synapse 1000 DWU setting in 1 hr. On a small SQL endpoint it took 34 min. What's the equivalent SQL endpoint compute for Synapse@1000DWU? I know there ...

Latest Reply
arslanapk99
New Contributor II
  • 10 kudos

The costs vary.

10 More Replies
cristianc
by Contributor
  • 1944 Views
  • 2 replies
  • 0 kudos

How does the refresh work for AI/BI (formerly LakeView Dashboards)

Greetings, I'm writing this message because I want to understand how the "automatic refresh" feature works for AI/BI dashboards that use SQL Serverless endpoints. I'm asking because sometimes the published dashboard refreshes when viewing the link ...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

The dashboard will refresh under conditions such as a manual refresh, a scheduled refresh, or when a parameter changes, which triggers a refresh.

1 More Reply
151640
by New Contributor III
  • 1226 Views
  • 2 replies
  • 0 kudos

Databricks JDBC driver: DatabaseMetaData.getColumns does not return columns of VARIANT type

The ResultSet returned by DatabaseMetaData.getColumns does not include the VARIANT column in a table; it only includes the non-VARIANT column. Databricks JDBC driver 02.06.40.1071. create table tvariant(rnum int, c1 variant);

Latest Reply
filipniziol
Esteemed Contributor
  • 0 kudos

Hi @151640, according to the documentation, the VARIANT data type is not supported by the Databricks JDBC driver. Here is the list of supported data types:
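
A hedged workaround sketch, not taken from the thread: expose the VARIANT column through a view that casts it to STRING so that JDBC clients without VARIANT support can still read it. The view name is hypothetical, and this assumes your runtime allows casting VARIANT to STRING.

  -- Hypothetical view over the table from the post; the cast yields a string
  -- representation of the VARIANT value that the driver can return.
  CREATE OR REPLACE VIEW tvariant_jdbc AS
  SELECT rnum, CAST(c1 AS STRING) AS c1_str
  FROM tvariant;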

1 More Reply
andre_rizzatti
by New Contributor II
  • 3306 Views
  • 3 replies
  • 2 kudos

SQL Warehouse Case Insensitive

Good morning, is there any parameter or configuration that makes all my data queries case-insensitive?

Latest Reply
MarianoRanu
New Contributor II
  • 2 kudos

Hi @raphaelblg, do you know of any update on this or any workaround? Regards, Mariano
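
As a hedged aside, not from this thread: the usual workarounds are to normalize case explicitly or to use a case-insensitive operator. The catalog, schema, table, and column names below are hypothetical.

  -- Normalize both sides of the comparison.
  SELECT * FROM my_catalog.my_schema.customers
  WHERE lower(customer_name) = lower('Acme');

  -- ILIKE is a case-insensitive LIKE in Databricks SQL.
  SELECT * FROM my_catalog.my_schema.customers
  WHERE customer_name ILIKE 'acme%';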

2 More Replies
techuser
by New Contributor III
  • 11741 Views
  • 6 replies
  • 1 kudos

Databricks Liquid Clustering

Hi, is it possible to convert an existing partitioned Delta table that already has data to liquid clustering? If so, can you please suggest the required steps? I tried and searched but couldn't find any. Is it that liquid clustering can be done only for new Delta table...

Latest Reply
Raja_Databricks
New Contributor III
  • 1 kudos

Does Liquid Clustering accept MERGE, or how can upserts be done efficiently with a liquid-clustered Delta table?
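
As a hedged sketch of the general pattern, with hypothetical table names rather than anything from this thread: an existing partitioned table can be rewritten into a liquid-clustered table with CTAS plus CLUSTER BY, and MERGE INTO then works on it like on any other Delta table.

  -- Rewrite a partitioned table as a liquid-clustered table.
  CREATE OR REPLACE TABLE main.demo.events
  CLUSTER BY (event_date)
  AS SELECT * FROM main.demo.events_partitioned;

  -- Upserts via MERGE work as usual on the clustered table.
  MERGE INTO main.demo.events AS t
  USING main.demo.events_updates AS s
  ON t.event_id = s.event_id
  WHEN MATCHED THEN UPDATE SET *
  WHEN NOT MATCHED THEN INSERT *;

  -- OPTIMIZE triggers clustering of existing data.
  OPTIMIZE main.demo.events;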

5 More Replies
Pavan3
by New Contributor II
  • 1025 Views
  • 2 replies
  • 0 kudos

Regarding database location in DBFS

Hi, I have used "SET spark.sql.warehouse.dir", which creates the directory by default. Then I created the database with the command "CREATE DATABASE IF NOT EXISTS database_name;", but when I used "DESCRIBE DATABASE database_name" I could not find the loca...

Latest Reply
filipniziol
Esteemed Contributor
  • 0 kudos

Hi @Pavan3, if the location is empty when running DESCRIBE DATABASE, then the database was created in the default catalog directory. What you can do is create any table in that database and run DESCRIBE DETAIL on that table. Hope it helps.
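
A minimal sketch of that suggestion, with hypothetical names.

  CREATE DATABASE IF NOT EXISTS demo_db;
  DESCRIBE DATABASE demo_db;            -- Location can appear empty when the default is used.

  CREATE TABLE IF NOT EXISTS demo_db.probe (id INT);
  DESCRIBE DETAIL demo_db.probe;        -- The location column shows the underlying storage path.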

1 More Reply
apiury
by New Contributor III
  • 1488 Views
  • 1 reply
  • 0 kudos

Connect .NET app to Delta table warehouse

Hi! I'm developing a .NET app and I want to use the Databricks warehouse as a database. I have gold Delta tables that I want to query. In the documentation, I can see an ODBC/JDBC driver; are those connectors fast? Is there another way to connect? What ...

Latest Reply
rangu
New Contributor III
  • 0 kudos

We have been using .NET apps connected to Databricks Delta tables through clusters, using ODBC to achieve this. However, we recently hit a roadblock after UC migration, where the UC all-purpose cluster started giving issues with queries ...

Aya-Ahmed
by New Contributor II
  • 1752 Views
  • 2 replies
  • 0 kudos

Parquet Encryption/Decryption in Databricks

Hi everyone, I'm curious about Databricks' approach to encrypting and decrypting Parquet files. Does Databricks adhere to standard encryption/decryption methods for Parquet? If not, what specific methods or techniques are used? I'd appreciate any insig...

Latest Reply
Witold
Honored Contributor
  • 0 kudos

Since Databricks uses Spark, you should be able to use, e.g., Columnar Encryption. Besides that, you can look into this and the AES-specific functions.
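
A minimal sketch of the AES functions mentioned above. The literal key and value are placeholders only; a real key should come from a secret scope rather than being hard-coded.

  -- aes_encrypt returns BINARY; aes_decrypt reverses it with the same 16-, 24-, or 32-byte key.
  SELECT
    base64(aes_encrypt('sensitive value', 'abcdefghijklmnop'))  AS encrypted_b64,
    CAST(aes_decrypt(aes_encrypt('sensitive value', 'abcdefghijklmnop'),
                     'abcdefghijklmnop') AS STRING)             AS decrypted;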

1 More Reply
AndrejZ
by New Contributor
  • 1552 Views
  • 1 reply
  • 0 kudos

Shared Parameters between queries on a dashboard

I would like to create a simple governance dashboard with multiple queries (a query to see user login events, a query to see SQL statements run, a query for jobs executed, etc.). What I would like to do is have a single user name parameter which would ...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Yes, you can set dashboard parameters so that you provide the username in the parameter or widget and it gets distributed to the different queries: https://docs.databricks.com/en/dashboards/parameters.html
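
A hedged sketch of that pattern: two dataset queries on the same dashboard referencing one shared :user_name parameter. The parameter and table names are hypothetical; the parameter itself is defined and wired to the queries in the dashboard UI.

  -- Dataset 1: login events for the selected user.
  SELECT * FROM my_catalog.audit.login_events
  WHERE user_name = :user_name;

  -- Dataset 2: SQL statements run by the selected user.
  SELECT * FROM my_catalog.audit.query_history
  WHERE user_name = :user_name;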

qwerty3
by Contributor
  • 5856 Views
  • 21 replies
  • 3 kudos

Spark dataframe performing poorly

I have huge datasets; transformation, display, print, and show all work well on this data when it is read into a pandas DataFrame. But the same DataFrame, when converted to a Spark DataFrame, takes minutes to display even a single row and hours to write th...

Latest Reply
gchandra
Databricks Employee
  • 3 kudos

I understand you want it sooner. Did it at least write the data in 10 minutes, compared to not writing before? There are more knobs you can tweak, like spark.sql.shuffle.partitions=auto. Do you have any index columns in your spatial data that can be us...
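
For reference, a minimal sketch of the setting mentioned above as it would be applied in a SQL cell; the auto value delegates the shuffle partition count to Databricks and assumes adaptive query execution is enabled (the default).

  SET spark.sql.shuffle.partitions = auto;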

20 More Replies
qwerty3
by Contributor
  • 1329 Views
  • 3 replies
  • 0 kudos

Unable to obtain count of dataframe

I am unable to obtain a count of a DataFrame; it always gets stuck at one stage. I have tried reducing the size. What can the issue be? How can I read the cluster logs to identify the issue?

Latest Reply
qwerty3
Contributor
  • 0 kudos

Driver memory is good enough; it is able to handle 90 lakh (9 million) records, and what I am giving it is definitely less than that. What can I do about skewed data and shuffling?

2 More Replies
JS_L
by New Contributor II
  • 1581 Views
  • 2 replies
  • 1 kudos

ERROR: key not found in SQL when trying to pass the result of a CTE as a function parameter

Hi Community, I am trying to pass the result of a CTE as a function parameter, as in the code below: WITH t1 AS ( SELECT array_join(collect_list(output), ',') AS x FROM my_catalog.my_db.get_x(:startTime, :endTime) ) SELECT 'AM_offline' as Type, CASE WHEN off...

Latest Reply
JS_L
New Contributor II
  • 1 kudos

Hi @szymon_dybczak, thanks for replying. I don't think the issue is related to the data type, since the query works if I pass the subquery to the _x parameter without a CTE. Please see the code below: SELECT 'AM_offline' as Type, CASE WHEN offline_ratio > 1.5 THEN 'no-Go...
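
A hedged, simplified sketch of the working pattern described here: pass the value as an inline scalar subquery instead of computing it in a CTE. get_x, :startTime, and :endTime mirror the thread, while some_fn is a hypothetical stand-in for the function that takes the _x parameter.

  SELECT *
  FROM my_catalog.my_db.some_fn(
    (SELECT array_join(collect_list(output), ',')
     FROM my_catalog.my_db.get_x(:startTime, :endTime))
  );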

1 More Reply