Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 2904 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
dbph
by New Contributor II
  • 1926 Views
  • 1 reply
  • 1 kudos

How to use Serverless as DBT-CLI compute?

Hi, we'd like to use serverless as the compute for the DBT CLI (of course, we already used Serverless SQL before) in a DBT workflow. I configured a normal DBT task and tried to run a dbt run command, which I previously tested successfully on my local machine...

Latest Reply
dbph
New Contributor II
  • 1 kudos

Hi, thanks for your help! Unfortunately, our problem is not the Databricks CLI we're using on local machines. The problem is the DBT CLI, which we are trying to run on serverless compute inside a Databricks workflow. I already tried adding the code you p...

Akshay_Petkar
by Contributor III
  • 1608 Views
  • 2 replies
  • 0 kudos

Databricks External Table (ADLS) Access with Power BI?

Can we connect Power BI directly to an Azure Databricks external table located in ADLS using cluster credentials?

Latest Reply
Wojciech_BUK
Valued Contributor III
  • 0 kudos

It depends. If you want to connect Power BI to a table (either external or managed), you can do it, BUT it will be through a cluster (each time you run something in Power BI Desktop, it will wake up the cluster or SQL endpoint and you will access data in thi...

1 More Replies
jakubk
by Contributor
  • 4179 Views
  • 2 replies
  • 3 kudos

Resolved! Unity Catalog information schema columns metadata out of sync with table - can't refresh

I'm using Unity Catalog. I've changed the schema of my table by overwriting it with a newer file: df.write.format('delta').partitionBy(partitionColumn).mode("overwrite").option("overwriteSchema", "true")...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 3 kudos

@jakubk Try running REPAIR TABLE <table_name> SYNC METADATA

1 More Replies
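
For context, a minimal PySpark sketch combining the schema overwrite from the question with the fix from the reply; the table name main.default.events, the partition column event_date, and the sample data are hypothetical placeholders:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical replacement data whose schema differs from the existing table.
df = spark.createDataFrame(
    [("2024-07-01", 1, "extra")],
    ["event_date", "event_id", "new_col"],
)

# Overwrite the Delta table and its schema, as in the original question.
(df.write
    .format("delta")
    .partitionBy("event_date")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable("main.default.events"))

# Re-sync the Unity Catalog / information_schema column metadata, per the reply.
spark.sql("REPAIR TABLE main.default.events SYNC METADATA")
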
Shaimaa
by New Contributor II
  • 8945 Views
  • 2 replies
  • 0 kudos

Running SQL queries against a parquet folder in S3

I need to run SQL queries against a parquet folder in S3. I am trying to use "read_files", but sometimes my queries fail due to errors while inferring the schema and sometimes without a specified reason. Sample query: SELECT SUM(CASE WHEN match_resu...

Latest Reply
holly
Databricks Employee
  • 0 kudos

There are a few alternatives for you. 1. A switch in syntax - I doubt this will make much difference, but worth a shot: SELECT ... FROM parquet.`s3://folder_path` 2. Create a view first, then query against it. You should get better errors this way. CRE...

1 More Replies
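
A rough sketch of the two alternatives from the reply, run through PySpark; the S3 path and the match_result column are hypothetical placeholders:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Alternative 1: query the folder directly with the parquet.`path` syntax.
direct = spark.sql("SELECT * FROM parquet.`s3://my-bucket/match_results/`")

# Alternative 2: create a view over read_files first, then query the view;
# schema-inference problems then tend to surface with clearer errors.
spark.sql("""
    CREATE OR REPLACE TEMPORARY VIEW match_results_v AS
    SELECT * FROM read_files('s3://my-bucket/match_results/', format => 'parquet')
""")
wins = spark.sql("""
    SELECT SUM(CASE WHEN match_result = 'win' THEN 1 ELSE 0 END) AS wins
    FROM match_results_v
""")
wins.show()
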
acurus
by New Contributor II
  • 3559 Views
  • 2 replies
  • 2 kudos

Not able to escape `-` in externally connected table name

We are having some issues getting data from some tables with the character `-` in their table name. We are connected to the database with a SQL Server connection, and the database is (as far as we know) a Microsoft Azure SQL Database. We do not ...

Latest Reply
martinschou
New Contributor II
  • 2 kudos

Had the same issue when querying a table with the `-` character in the table name. Got the error: Incorrect syntax near '-'. Got the error on Databricks Runtime version 13.2 (includes Apache Spark 3.4.0, Scala 2.12). No error when using Databricks runtim...

1 More Replies
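
As a sketch only: back-tick quoting the identifier that contains the dash is the usual workaround, though as the reply notes the behaviour still depends on the runtime version. The catalog, schema, and table names below are hypothetical:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Back-tick quote the federated table name containing the dash; per the reply
# this still failed on DBR 13.2 but worked on a newer runtime.
df = spark.sql("SELECT * FROM sqlserver_cat.dbo.`sales-2024`")
df.show()
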
ssequ
by New Contributor II
  • 2826 Views
  • 3 replies
  • 2 kudos

Resolved! Backspaces in Foreign Catalog Table Names

Hi, is there a way to import tables with UC-disallowed characters when creating a foreign catalog? The database we are dealing with contains table names with backspaces, and UC seems to completely ignore them when creating the catalog. Thanks for any he...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 2 kudos

@ssequ I assume you're referring to Lakehouse Federation? Unfortunately, that's a limitation: tables with disallowed characters will be ignored.

2 More Replies
vishva-fivetran
by New Contributor II
  • 3003 Views
  • 2 replies
  • 0 kudos

Databricks query on a warehouse failing with `spark.driver.maxResultSize` error

We are trying to run a SELECT * from one of our catalog tables. We are seeing the error: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.SparkException: Job aborted due to stage failure: Total size of serialized r...

Latest Reply
anardinelli
Databricks Employee
  • 0 kudos

Hello @vishva-fivetran, how are you? You can set the spark.driver.maxResultSize property in the cluster Spark config, to a value higher than the one reported in the exception message. For ...

1 More Replies
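
A minimal sketch of the reply's suggestion, assuming an interactive cluster whose Spark config you control; the 8g value and the table name are hypothetical, and fetching fewer rows is usually the safer fix:

from pyspark.sql import SparkSession

# On Databricks the property normally goes into the cluster's Spark config
# (Advanced options); the builder call below is the plain-Spark equivalent
# and only takes effect for a freshly created session.
spark = (
    SparkSession.builder
    .config("spark.driver.maxResultSize", "8g")
    .getOrCreate()
)

# SELECT * on a large table is what pushes results past the driver limit;
# limiting (or aggregating) before fetching avoids the error entirely.
preview = spark.sql("SELECT * FROM main.default.big_table LIMIT 1000")
preview.show()
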
Akshay_Petkar
by Contributor III
  • 2622 Views
  • 1 reply
  • 1 kudos

Resolved! Is Photon Enabled by Default for Warehouses in Databricks?

When creating a cluster in Databricks, I notice an option for enabling Photon. However, this option does not appear when configuring a warehouse. Is Photon enabled by default for warehouses in Databricks, or is there a different procedure to enable i...

Latest Reply
Yeshwanth
Databricks Employee
  • 1 kudos

@Akshay_Petkar good day! Photon is enabled by default for Databricks SQL warehouses. There is no need for a separate procedure to enable it for warehouses. For clusters, you have the option to manually enable or disable Photon by selecting the "Use P...

DominikBraun
by New Contributor II
  • 2455 Views
  • 2 replies
  • 1 kudos

Resolved! SQL Warehouse: INVALID_PARAMETER_VALUE when starting

Hey everybody. When creating a SQL Warehouse and trying to start it, I get the following error message: Clusters are failing to launch. Cluster launch will be retried. Request to create a cluster failed with an exception: INVALID_PARAMETER_VALUE: Cannot...

Latest Reply
Ismael-K
Databricks Employee
  • 1 kudos

One suggestion: have the workspace admin check the Data Access Configuration properties in the Workspace Settings for any secrets the warehouse may be trying to access on startup. These data access prop...

1 More Replies
Akshay_Petkar
by Contributor III
  • 2071 Views
  • 1 reply
  • 0 kudos

Resolved! Databricks warehouse cost calculation

I would like to know how the cost for a warehouse in Databricks is calculated. Specifically, if the cost for a 2x small warehouse is 4 DBUs per hour, how is the cost determined if I use the warehouse for only 30 minutes and then terminate it? Will it...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 0 kudos

@Akshay_Petkar As stated in the Databricks documentation: "Databricks offers you a pay-as-you-go approach with no up-front costs. Only pay for the products you use at per-second granularity." It will charge 2 DBUs.

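
A quick back-of-the-envelope check of the numbers in this thread, with the 4 DBU/hour rate taken from the question:

# Per-second billing: a 4 DBU/hour 2X-Small warehouse that runs for 30 minutes
# and is then terminated consumes 4 * 0.5 = 2 DBUs.
dbu_per_hour = 4            # 2X-Small SQL warehouse rate from the question
runtime_hours = 30 / 60     # warehouse terminated after 30 minutes
dbus_consumed = dbu_per_hour * runtime_hours
print(dbus_consumed)        # 2.0 (multiply by your contract's $/DBU for the cost)
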
Akshay_Petkar
by Contributor III
  • 1901 Views
  • 1 reply
  • 0 kudos

Facing Data Truncation Issues in Databricks Dashboards

I'm encountering data truncation in my Databricks dashboards. I'm working with a large dataset, and the dashboard only displays a limited number of (truncated) rows. Let's take a dataset containing 1 million sales records. The dashboard currently only...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Hi @Akshay_Petkar, this is simple in Databricks SQL: just uncheck LIMIT 1000 in the dropdown. https://docs.databricks.com/en/sql/get-started/visualize-data-tutorial.html LIMIT 1000 is selected by default for all queries to ensure that the query ret...

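
Beyond unchecking LIMIT 1000, a common complementary approach (not from the reply itself) is to pre-aggregate so the chart never needs a million raw rows; the table and column names below are hypothetical:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Aggregate 1M raw sales records down to one row per month before charting,
# so the dashboard result set stays far below any row-display limit.
monthly = spark.sql("""
    SELECT date_trunc('month', sale_date) AS sale_month,
           SUM(amount) AS total_sales
    FROM main.default.sales
    GROUP BY date_trunc('month', sale_date)
    ORDER BY sale_month
""")
monthly.show()
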
diego_poggioli
by Contributor
  • 9398 Views
  • 5 replies
  • 2 kudos

Resolved! Nested subquery is not supported in the DELETE condition

According to the documentation, the WHERE predicate in a DELETE statement should support subqueries, including IN, NOT IN, EXISTS, NOT EXISTS, and scalar subqueries. If I try to run a query like: DELETE FROM dev.gold.table AS trg WHERE EXISTS ( ...

Latest Reply
Tejas2022
New Contributor II
  • 2 kudos

@diego_poggioli Can you try selecting the 'year_month_version' column from the view instead of SELECT *? DELETE FROM dev.gold.table AS trg WHERE year_month_version IN (select year_month_version FROM v_distinct_year_month_version)

4 More Replies
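
For reference, the workaround suggested in the reply as a runnable PySpark call, using the table and view names from the thread:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Workaround from the reply: name the join column explicitly in the subquery
# instead of running SELECT * against the view.
spark.sql("""
    DELETE FROM dev.gold.table AS trg
    WHERE year_month_version IN (
        SELECT year_month_version FROM v_distinct_year_month_version
    )
""")
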
Akshay_Petkar
by Contributor III
  • 3615 Views
  • 2 replies
  • 2 kudos

Resolved! Truncated Data on Lakeview Dashboard

When we create a Lakeview dashboard, the visuals show truncated data. I want to create a dashboard using the entire dataset because the charts do not display the exact values needed for accurate analysis.

Latest Reply
raphaelblg
Databricks Employee
  • 2 kudos

Hello @Akshay_Petkar, the front-end (FE) rendering limit determines how much of the data returned by the back end (BE) is rendered on the FE. The Render all button allows you to render all the data on the FE that has been returned by the BE. Re...

1 More Replies