Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 1108 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards, and insights. Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
Akshay_Petkar
by Contributor II
  • 169 Views
  • 2 replies
  • 1 kudos

Databricks Dashboard Not Refreshing as per Schedule (Per Minute)

I have created an AI/BI dashboard on Databricks and set it to refresh every 1 minute. However, the dashboard is not refreshing as expected. Interestingly, when I terminate the warehouse, it triggers the warehouse but still does not refresh per minute ...

Latest Reply
Akshay_Petkar
Contributor II
  • 1 kudos

Hi @Alberto_Umana, I have followed the correct approach to create and schedule the dashboard for a 1-minute refresh, but it's not updating every minute as expected. I have attached images for your reference. Please take a look.

1 More Replies
onlyme
by New Contributor II
  • 202 Views
  • 2 replies
  • 1 kudos

Resolved! Actions for warehouse channel update

Hello, let's say I create a SQL Warehouse on the Current channel (2024.40) and there is a new release (2024.50). Would I need to take some action (a reboot, for example) so that my warehouse uses the 2024.50 version, or should it run on 2024.50 whenever th...

Latest Reply
Isi
New Contributor III
  • 1 kudos

Hey @onlyme, the Channel setting in Databricks SQL Warehouse has two options: 1. Current: this corresponds to the latest stable version released by Databricks and updates automatically. 2. Preview: similar to a beta version, it includes improvements and new feat...

1 More Replies
GarCab
by New Contributor
  • 166 Views
  • 1 replies
  • 0 kudos

Databricks connectivity issue with PBI service

Hello everyone, I created a report using Power BI Desktop that I successfully connected to Databricks. However, in the Power BI service, the visuals are not displayed and I'm asked to edit the credentials of the semantic model. When doing so, I get the following er...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @GarCab, can you try with a PAT token, just to confirm it works? https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi#connect-power-bi-desktop-to-azure-databricks

Hania__b
by New Contributor II
  • 201 Views
  • 3 replies
  • 2 kudos

Resolved! Access specific input item of For Each Tasks

Hi, I think I have a similar issue to the one in this post, but the answer isn't detailed enough for me. I have a list defined in my first task, which contains the items I want to iterate through: [1,2,3,4]. When I use it as Inputs to the For Each frami...

Latest Reply
Hania__b
New Contributor II
  • 2 kudos

Thank you both very much, I've nailed it. I have accepted Walter_C's answer as the solution because Step 2 is what I was missing. Thanks MariuszK as well for your contribution.

2 More Replies
alex456897874
by New Contributor II
  • 95 Views
  • 1 replies
  • 0 kudos

how to reuse filter between pages in dashboards?

Hi! I have a multi-page dashboard based on a handful of tables sharing a common run_id field. I created a dropdown filter widget on the first page to filter datasets based on the table's run_id field. It worked fine for this page, but how do I reuse it fo...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @alex456897874, can you try this approach? Create a global parameter: navigate to the Data tab of your dashboard and define a new parameter based on the run_id field. This parameter will be used globally across all pages. Configure the Filter W...

DEShoaib
by New Contributor II
  • 456 Views
  • 6 replies
  • 1 kudos

Fabric with Databricks

Do we have the same functionality if we use Databricks with Fabric as it provides with Azure?

Latest Reply
MariuszK
Contributor
  • 1 kudos

Hi @DEShoaib, are you planning to move a Dedicated (T-SQL) pool or Spark code? With Databricks you can replicate all features from Azure Synapse, and you have the possibility to use PySpark and Databricks SQL. MS Fabric has nice integration with Power BI and eas...

5 More Replies
NS2
by New Contributor II
  • 234 Views
  • 3 replies
  • 3 kudos

Resolved! Databricks SQL connector for python

For the Databricks SQL connector for Python, the list of fields returned by Cursor.columns() is listed here (like TABLE_CAT, TABLE_SCHEM, TABLE_NAME, COLUMN_NAME). Could someone please share an exhaustive list of fields (including short description ...

Latest Reply
Satyadeepak
Databricks Employee
  • 3 kudos

All those fields are explained in the doc; for example, search for `TABLE_CAT`.

2 More Replies
vidya_kothavale
by New Contributor III
  • 366 Views
  • 2 replies
  • 3 kudos

Resolved! Issue with MongoDB Spark Connector in Databricks

I followed the official Databricks documentation (https://docs.databricks.com/en/_extras/notebooks/source/mongodb.html) to integrate MongoDB Atlas with Spark by setting up the MongoDB Spark Connector and configuring the connection string in my Datab...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 3 kudos

Hi @vidya_kothavale, could you try changing "spark.mongodb.input.uri" to the following? spark.read.format("mongodb").option("spark.mongodb.read.connection.uri"
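A minimal sketch of what the suggested rename might look like, assuming the v10+ MongoDB Spark Connector; the connection URI, database, and collection names below are placeholders, and `spark` is assumed to be the SparkSession a Databricks cluster provides:

```python
# Sketch only: the key point from the reply is the option rename -- the legacy
# "spark.mongodb.input.uri" key becomes "spark.mongodb.read.connection.uri" in v10+.
read_options = {
    "spark.mongodb.read.connection.uri": "mongodb+srv://<user>:<password>@<cluster>/",
    "database": "sample_db",            # placeholder database name
    "collection": "sample_collection",  # placeholder collection name
}

# On a cluster with the connector library installed, the read itself would be:
# df = spark.read.format("mongodb").options(**read_options).load()
```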

1 More Replies
boitumelodikoko
by Contributor
  • 534 Views
  • 9 replies
  • 6 kudos

Resolved! Assistance Needed: Issues with Databricks SQL Queries and Performance

Hi everyone, I hope you're all doing well. I'm experiencing some challenges with Databricks SQL, and I wanted to reach out to see if others have encountered similar issues or have suggestions for troubleshooting. Below is a summary of the problems I'm ...

Latest Reply
boitumelodikoko
Contributor
  • 6 kudos

Hi @Walter_C, thank you for your input and support regarding the challenges I’ve been experiencing with Databricks SQL. I followed up with support, and they confirmed that these are known issues currently under review. Here’s a summary of the response:...

8 More Replies
VCA50380
by New Contributor III
  • 157 Views
  • 2 replies
  • 0 kudos

Equivalent of Oracle's CLOB in Databricks

Dear all, (migrating from an on-premise Oracle ...) The question is in the subject: "What is the equivalent of Oracle's CLOB in Databricks"? I saw that the "string" type can go up to 50 thousand characters, which is quite good in most of our cases, but...

Latest Reply
VCA50380
New Contributor III
  • 0 kudos

Hello, thanks for the answer. For the concatenation itself, it is not an issue. My question is: does Databricks support something bigger than the 'string' data type? Thanks

1 More Replies
EWhitley
by New Contributor III
  • 4609 Views
  • 3 replies
  • 3 kudos

Resolved! Retrieve task name within workflow task (notebook, python)?

Using workflows, is there a way to obtain the task name from within a task? EX: I have a workflow with a notebook task. From within that notebook task I would like to retrieve the task name so I can use it for a variety of purposes. Currently, we're re...

Latest Reply
ttamas
New Contributor III
  • 3 kudos

Hi @EWhitley, would {{task.name}} help in getting the current task name? https://docs.databricks.com/en/workflows/jobs/parameter-value-references.html ("Pass context about job runs into job t...")
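A minimal sketch of how that dynamic value reference might be wired into a notebook task; the parameter name `current_task` is invented for illustration:

```python
# Hypothetical base_parameters mapping for a notebook task in a Databricks job.
# At run time the jobs service substitutes {{task.name}} with the task's own name,
# so the notebook receives it as a parameter without hard-coding anything.
base_parameters = {"current_task": "{{task.name}}"}

# Inside the notebook task, the substituted value would then be read with:
# task_name = dbutils.widgets.get("current_task")
```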

2 More Replies
leo-machado
by New Contributor III
  • 445 Views
  • 5 replies
  • 2 kudos

(Big) Problem with SQL Warehouse Auto stop

Long story short, I'm not sure if this is an already known problem, but the Auto Stop feature on SQL Warehouses after minutes of inactivity is not working properly. We started using SQL Warehouses more aggressively this December when we scaled up one ...

Latest Reply
pdiamond
New Contributor III
  • 2 kudos

Is this still being investigated by Databricks? I'm seeing similar behavior that's costing us a lot of money.

4 More Replies
boitumelodikoko
by Contributor
  • 1180 Views
  • 2 replies
  • 0 kudos

Resolved! Internal Error During Spark SQL Phase Optimization – Possible Bug in Spark/Databricks Runtime

We are experiencing the following issues. Description: I encountered an issue while executing a Spark SQL query in Databricks, and it seems to be related to the query optimization phase. The error message suggests an internal bug within Spark or the Sp...

Latest Reply
boitumelodikoko
Contributor
  • 0 kudos

Update: response from the Databricks team. Symptoms: internal error during Spark SQL phase optimization. Cause: the Databricks PG engineering team confirmed that this is indeed a bug in CASE WHEN optimization and they are working on the fix for this issue. Resolut...

1 More Replies
iscpablogarcia
by New Contributor II
  • 133 Views
  • 1 replies
  • 2 kudos

How can i set the workflow status to Skipped?

I have a Python script workflow with 2 tasks: Task A and Task B. When Task A has data, it is shared to Task B via createOrReplaceGlobalTempView with no issues. The goal is: when A has no data, skip Task B and also set the workflow status to "Skip...

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

To achieve the goal of setting the workflow status to "Skipped" when Task A has no data, you can use the "Run if" conditional task type in Databricks Jobs. This allows you to specify conditionals for later tasks based on the outcome of other tasks. ht...
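A hedged sketch of the Jobs API shape this suggests, assuming Task A publishes a task value; the task names and the task-value key `has_data` are invented for illustration, not taken from the thread:

```python
# Hypothetical Jobs API 2.1 task list: a condition task gates Task B, so Task B
# does not run when Task A reports it produced no data.
tasks = [
    # Task A would call: dbutils.jobs.taskValues.set(key="has_data", value=...)
    {"task_key": "task_a"},
    {
        "task_key": "check_has_data",
        "depends_on": [{"task_key": "task_a"}],
        "condition_task": {
            "op": "EQUAL_TO",
            "left": "{{tasks.task_a.values.has_data}}",
            "right": "true",
        },
    },
    {
        # Task B depends on the "true" outcome of the condition task;
        # on the "false" outcome it is not executed.
        "task_key": "task_b",
        "depends_on": [{"task_key": "check_has_data", "outcome": "true"}],
    },
]
```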

daviddekoning
by New Contributor II
  • 135 Views
  • 1 replies
  • 1 kudos

Resolved! Container Service on Windows base container

I have some legacy software that only runs on Windows but that can be driven via Python. Is it possible to set up compute resources that run Databricks Container Service on a Windows base image, so that I can then add this legacy software and work w...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Unfortunately this is not possible, as part of the requirements you need to use an Ubuntu image: https://docs.databricks.com/en/compute/custom-containers.html#option-2-build-your-own-docker-base 

