Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 1375 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap: use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights. Keynote: Data Warehouse presente...

Labels: Warehousing & Analytics, AI BI Dashboards, AI BI Genie, Databricks SQL
Akshay_Petkar
by Contributor III
  • 1359 Views
  • 5 replies
  • 2 kudos

How to Display Top Categories in Databricks AI/BI Dashboard?

In a Databricks AI/BI dashboard, I have a field with multiple categories (e.g., district-wise sales with 50 districts). How can I display only the top few categories (like the top 10) based on a specific metric such as sales?

Latest Reply
Mo
Databricks Employee
  • 2 kudos

Hey @migq2, @maks: in the AI/BI dashboard, in your data section, add a limit parameter to all your datasets, e.g. SELECT * FROM my_table LIMIT :limit_number. Then, when you're on the canvas and adding visualizations, add a filter and create a parameter with single val...
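
For illustration, here is a minimal sketch (the district_sales table and its columns are assumed, not from the thread) of the kind of parameterized top-N query such a dataset could use; it runs in a Databricks notebook on DBR 13+ / Spark 3.4+, where spark.sql supports named parameters with the same :limit_number syntax as the dashboard parameter above.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Top-N query with a named :limit_number parameter, mirroring the
    # dashboard parameter suggested above (district_sales is illustrative).
    top_districts = spark.sql(
        """
        SELECT district, SUM(sales) AS total_sales
        FROM district_sales
        GROUP BY district
        ORDER BY total_sales DESC
        LIMIT :limit_number
        """,
        args={"limit_number": 10},
    )
    top_districts.show()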

4 More Replies
Akshay_Petkar
by Contributor III
  • 247 Views
  • 1 reply
  • 0 kudos

How to get the size of selected rows in bytes using a single SQL query?

Hi all, I have a table named employee in Databricks. I ran the following query to filter out rows where the salary is greater than 25000. This query returns 10 rows. I want to find the size of these 10 rows in bytes, and I would like to calculate or re...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @Akshay_Petkar, you can try with this query: SELECT SUM(LENGTH(CAST(employee.* AS STRING))) AS total_size_in_bytes FROM employee WHERE salary > 25000;
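
As a hedged alternative sketch (assuming a Databricks notebook; the employee table and the salary filter come from the thread), you can approximate the size of the selected rows by serializing each matching row to JSON and summing the byte lengths. Note this measures the textual row size, not the compressed on-disk Delta size.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Serialize each matching row to JSON and sum the byte lengths to get an
    # approximate textual size of the selected rows.
    spark.sql(
        """
        SELECT SUM(OCTET_LENGTH(TO_JSON(STRUCT(*)))) AS approx_size_in_bytes
        FROM employee
        WHERE salary > 25000
        """
    ).show()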

vidya_kothavale
by New Contributor III
  • 459 Views
  • 1 reply
  • 1 kudos

Resolved! Insufficient Permissions Error When Reading Data from S3 in Shared Databricks Compute

I am using a Shared Databricks Compute and trying to read data from an S3 bucket via an Instance Profile. However, I am encountering the following error: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have permission SELECT on any ...

Latest Reply
Ayushi_Suthar
Databricks Employee
  • 1 kudos

Hi @vidya_kothavale, greetings! Can you please refer to this article and check if it helps you resolve your issue: https://kb.databricks.com/en_US/data/user-does-not-have-permission-select-on-any-file Please note that these permissions are only ...
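
For reference, the KB article linked above addresses this error by granting the legacy ANY FILE privilege. A hedged sketch a workspace admin could run from a notebook (the user principal is a placeholder, not from the thread):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Grant the legacy table-ACL privilege referenced in the error message.
    # Replace the principal with the affected user or group.
    spark.sql("GRANT SELECT ON ANY FILE TO `user@example.com`")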

Akshay_Petkar
by Contributor III
  • 324 Views
  • 2 replies
  • 1 kudos

Databricks Dashboard Not Refreshing as per Schedule (Per Minute)

I have created an AI/BI dashboard on Databricks and set it to refresh every minute. However, the dashboard is not refreshing as expected. Interestingly, when I terminate the warehouse, the schedule does start the warehouse again, but the dashboard still does not refresh per minute ...

Latest Reply
Akshay_Petkar
Contributor III
  • 1 kudos

Hi @Alberto_Umana, I have followed the correct approach to create and schedule the dashboard for a 1-minute refresh, but it's not updating every minute as expected. I have attached images for your reference. Please take a look.

1 More Replies
onlyme
by New Contributor II
  • 370 Views
  • 2 replies
  • 1 kudos

Resolved! Actions for warehouse channel update

Hello, let's say I create a SQL Warehouse on the Current channel (2024.40) and there is a new release (2024.50). Would I need to take some action (a reboot, for example) so that my warehouse uses the 2024.50 version, or will it run on 2024.50 whenever th...

Latest Reply
Isi
Contributor
  • 1 kudos

Hey @onlyme, the Channel in Databricks SQL Warehouse has two options:
  1. Current: this corresponds to the latest stable version released by Databricks and updates automatically.
  2. Preview: similar to a beta version, it includes improvements and new feat...
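
If you want to confirm which channel a warehouse is currently on, here is a hedged sketch using the Databricks Python SDK (the warehouse ID is a placeholder; authentication is assumed to come from environment variables or a config profile):

    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()  # picks up credentials from env vars / config profile

    # Inspect the warehouse's channel; on the Current channel, new releases
    # such as 2024.50 are picked up automatically without a manual reboot.
    wh = w.warehouses.get(id="1234567890abcdef")
    print(wh.name, wh.channel)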

1 More Replies
GarCab
by New Contributor
  • 423 Views
  • 1 reply
  • 0 kudos

Databricks connectivity issue with PBI service

Hello everyone, I created a report using Power BI Desktop that I successfully connected to Databricks. However, in PBI Service, the visuals are not displayed and I'm asked to edit the credentials of the semantic model. When doing so, I get the following er...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @GarCab, Can you try with a PAT token, just to confirm it works? https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi#connect-power-bi-desktop-to-azure-databricks
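
If you need a PAT for that test, a hedged sketch creating one with the Databricks Python SDK (comment and lifetime are illustrative; you can also create the token from User Settings in the UI):

    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()  # assumes credentials via env vars / config profile

    # Create a short-lived personal access token to use as the Power BI
    # semantic model credential for the connectivity test.
    token = w.tokens.create(comment="power-bi-test", lifetime_seconds=86400)
    print(token.token_value)  # shown once; store it securely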

Hania__b
by New Contributor II
  • 458 Views
  • 3 replies
  • 2 kudos

Resolved! Access specific input item of For Each Tasks

Hi, I think I have a similar issue to the one in this post, but the answer isn't detailed enough for me. I have a list defined in my first task, which contains the items I want to iterate through: [1,2,3,4]. When I use it as Inputs to the For Each frami...

Latest Reply
Hania__b
New Contributor II
  • 2 kudos

Thank you both very much, I've nailed it. I have accepted Walter_C's answer as the solution because Step 2 is what I was missing. Thanks MariuszK as well for your contribution.
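
For anyone landing here later, a hedged sketch of the general pattern (task and key names are illustrative; assumes Databricks notebook tasks where dbutils is predefined): the first task publishes the list as a task value, the For Each task references it as its Inputs, and the nested task reads the current element.

    # --- In the first task: publish the list as a task value ---
    items = [1, 2, 3, 4]
    dbutils.jobs.taskValues.set(key="items", value=items)

    # --- In the For Each task configuration (not code) ---
    # Inputs: {{tasks.<first_task_name>.values.items}}
    # Nested task parameter "item": {{input}}

    # --- In the nested (iterated) notebook task: read the current element ---
    current_item = dbutils.widgets.get("item")
    print(f"Processing item: {current_item}")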

2 More Replies
DEShoaib
by New Contributor II
  • 915 Views
  • 6 replies
  • 1 kudos

Fabric with Databricks

Do we get the same functionality if we use Databricks with Fabric as we do with Azure?

Latest Reply
MariuszK
Contributor III
  • 1 kudos

Hi @DEShoaib, are you planning to move a Dedicated (T-SQL) pool or Spark code? With Databricks you can replicate all the features from Azure Synapse, and you have the option to use PySpark and Databricks SQL. MS Fabric has nice integration with Power BI and eas...

5 More Replies
NS2
by New Contributor II
  • 450 Views
  • 3 replies
  • 3 kudos

Resolved! Databricks SQL connector for Python

For the Databricks SQL Connector for Python, the list of fields returned by Cursor.columns() is listed here (like TABLE_CAT, TABLE_SCHEM, TABLE_NAME, COLUMN_NAME). Could someone please share an exhaustive list of fields (including a short description ...

Latest Reply
Satyadeepak
Databricks Employee
  • 3 kudos

All those fields are explained in the doc. For example, search for `TABLE_CAT`.
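
As a concrete starting point, a minimal sketch with the Databricks SQL Connector for Python (hostname, HTTP path, token, and the catalog/schema/table names are placeholders). Printing one row's keys enumerates every metadata field that Cursor.columns() returns.

    from databricks import sql

    with sql.connect(
        server_hostname="<workspace-hostname>",
        http_path="<warehouse-http-path>",
        access_token="<personal-access-token>",
    ) as connection:
        with connection.cursor() as cursor:
            cursor.columns(catalog_name="main", schema_name="default",
                           table_name="employee")
            rows = cursor.fetchall()
            if rows:
                # The keys of one row list every field returned, e.g. TABLE_CAT,
                # TABLE_SCHEM, TABLE_NAME, COLUMN_NAME, DATA_TYPE, TYPE_NAME...
                print(list(rows[0].asDict().keys()))
            for row in rows:
                print(row.TABLE_NAME, row.COLUMN_NAME, row.TYPE_NAME)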

2 More Replies
boitumelodikoko
by Contributor
  • 1014 Views
  • 9 replies
  • 6 kudos

Resolved! Assistance Needed: Issues with Databricks SQL Queries and Performance

Hi everyone, I hope you're all doing well. I'm experiencing some challenges with Databricks SQL, and I wanted to reach out to see if others have encountered similar issues or have suggestions for troubleshooting. Below is a summary of the problems I'm ...

Latest Reply
boitumelodikoko
Contributor
  • 6 kudos

Hi @Walter_C, thank you for your input and support regarding the challenges I’ve been experiencing with Databricks SQL. I followed up with support, and they confirmed that these are known issues currently under review. Here’s a summary of the response:...

8 More Replies
EWhitley
by New Contributor III
  • 4951 Views
  • 3 replies
  • 3 kudos

Resolved! Retrieve task name within workflow task (notebook, python)?

Using workflows, is there a way to obtain the task name from within a task? For example: I have a workflow with a notebook task. From within that notebook task I would like to retrieve the task name so I can use it for a variety of purposes. Currently, we're re...

Latest Reply
ttamas
New Contributor III
  • 3 kudos

Hi @EWhitley, would {{task.name}} help in getting the current task name? https://docs.databricks.com/en/workflows/jobs/parameter-value-references.html (Pass context about job runs into job t...)
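
A minimal sketch of that approach, assuming you add a task parameter (here named task_name) whose value is the dynamic reference {{task.name}}; Jobs resolves it at run time and the notebook reads it as a widget.

    # In the job task configuration: parameter "task_name" = "{{task.name}}"
    task_name = dbutils.widgets.get("task_name")  # dbutils is predefined in notebooks
    print(f"Running task: {task_name}")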

2 More Replies
leo-machado
by New Contributor III
  • 749 Views
  • 5 replies
  • 2 kudos

(Big) Problem with SQL Warehouse Auto stop

Long story short, I'm not sure if this is an already known problem, but the Auto Stop feature on SQL Warehouses (stop after N minutes of inactivity) is not working properly. We started using SQL Warehouses more aggressively this December when we scaled up one ...

Latest Reply
pdiamond
New Contributor III
  • 2 kudos

Is this still being investigated by Databricks? I'm seeing similar behavior that's costing us a lot of money.

4 More Replies
boitumelodikoko
by Contributor
  • 1946 Views
  • 2 replies
  • 0 kudos

Resolved! Internal Error During Spark SQL Phase Optimization – Possible Bug in Spark/Databricks Runtime

We are experiencing the following issues. Description: I encountered an issue while executing a Spark SQL query in Databricks, and it seems to be related to the query optimization phase. The error message suggests an internal bug within Spark or the Sp...

Latest Reply
boitumelodikoko
Contributor
  • 0 kudos

Update: response from the Databricks team. Symptoms: internal error during Spark SQL phase optimization. Cause: the Databricks PG engineering team confirmed that this is indeed a bug in CASE WHEN optimization and they are working on a fix for this issue. Resolut...

1 More Replies
iscpablogarcia
by New Contributor II
  • 305 Views
  • 1 reply
  • 2 kudos

How can I set the workflow status to Skipped?

I have a Python script workflow with 2 tasks: Task A and Task B. When Task A has data, it is shared with Task B via createOrReplaceGlobalTempView with no issues. The goal is: when A has no data, skip Task B and also set the workflow status to "Skip...

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

To achieve the goal of setting the workflow status to "Skipped" when Task A has no data, you can use the "Run if" conditional task type in Databricks Jobs. This allows you to specify conditionals for later tasks based on the outcome of other tasks. ht...
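
A hedged sketch of how Task A could expose whether it produced data (the table name is illustrative; assumes a Databricks notebook/Python task where spark and dbutils are predefined), so a downstream condition such as {{tasks.task_a.values.has_data}} can gate Task B and leave it skipped when there is nothing to process:

    # Task A: record whether any rows were produced as a task value that the
    # "Run if" / If-else condition guarding Task B can evaluate.
    df = spark.table("my_catalog.my_schema.source_table")
    has_data = not df.isEmpty()
    dbutils.jobs.taskValues.set(key="has_data", value=has_data)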

daviddekoning
by New Contributor II
  • 253 Views
  • 1 reply
  • 1 kudos

Resolved! Container Service on Windows base container

I have some legacy software that only runs on Windows, but that can be driven via Python. Is it possible to set up compute resources that run Databricks Container Service on a Windows base image, so that I can then add this legacy software and work w...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Unfortunately, this is not possible; as part of the requirements, you need to use an Ubuntu image: https://docs.databricks.com/en/compute/custom-containers.html#option-2-build-your-own-docker-base
