Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 1064 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
Akshay_Petkar
by Contributor II
  • 339 Views
  • 6 replies
  • 7 kudos

How to Complete CDW Migration Best Practices Training Labs on Databricks Learning Portal?

I am going through the labs for the CDW Migration Best Practices training on the Databricks Learning Portal and scored 75%. However, it still shows as "In Progress." Can anyone explain the criteria for marking the labs as complete? Do I need a higher ...

Latest Reply
vikram2406
New Contributor II
  • 7 kudos

Yes, it typically takes 10-15 minutes to generate a grade after checking the labs.

5 More Replies
chari
by Contributor
  • 6232 Views
  • 3 replies
  • 2 kudos

What are Databricks SQL and Spark SQL, and how are they different from MS SQL?

Hello Databricks Community, I have a hard time understanding how Databricks SQL is different from Microsoft SQL. Also, why does Databricks provide Spark SQL? If you can direct me to a well-written webpage or document, it would be of immense help! Thanks,

Latest Reply
Rahul_Saini
New Contributor II
  • 2 kudos

Databricks SQL and Spark SQL are built for distributed big data analytics. Databricks SQL is great for business intelligence tools and uses Delta Lake for efficient data storage. Spark SQL works with Spark's programming features for data processing. U...

2 More Replies
nomnomnom543
by New Contributor II
  • 480 Views
  • 3 replies
  • 2 kudos

Databricks SQL Wildcard Operator Not Parsed Correctly

Hello there, I wasn't sure if this was just an error on my part, but I'm using a Databricks Pro SQL warehouse and Unity Catalog to pull some data from my tables. I'm having this issue where whenever I try to use a wildcard operator with my LIKE claus...

Latest Reply
Rahul_Saini
New Contributor II
  • 2 kudos

Hi @nomnomnom543, try this: SELECT * FROM table_name WHERE LEFT(column_name, LENGTH('string')) = 'string';
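For later readers, a minimal sketch of both approaches; the catalog, table, and column names below are placeholders, not the ones from this thread:

-- Prefix match without LIKE, as in the reply above:
SELECT *
FROM main.demo.events
WHERE LEFT(event_name, LENGTH('error')) = 'error';

-- If the goal is to match a literal _ or % instead of treating it as a wildcard,
-- Databricks SQL LIKE also accepts an explicit ESCAPE character:
SELECT *
FROM main.demo.events
WHERE event_name LIKE 'error!_%' ESCAPE '!';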

2 More Replies
amelia1
by New Contributor II
  • 1448 Views
  • 1 reply
  • 0 kudos

Local pyspark read data using jdbc driver returns column names only

Hello, I have an Azure serverless SQL warehouse instance that I can connect to using databricks-sql-connector. But when I try to use pyspark and a JDBC driver URL, I can't read or write. See my code below: def get_jdbc_url(): # Define your Databricks p...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

The error does not look specific to the warehouse that you are connecting to. The error message "Unrecognized conversion specifier [msg] starting at position 54 in conversion pattern" indicates that there is an issue with the logging configuration in...

OfirM
by New Contributor
  • 267 Views
  • 1 reply
  • 0 kudos

spark.databricks.optimizer.replaceWindowsWithAggregates.enabled

I have seen in the release notes of 15.3 that this was introduced and couldn't wrap my head around it. Does someone have an example of a plan before and after? Quote: "Performance improvement for some window functions: This release includes a change that im...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Before Optimization: Consider a query that calculates the sum of a column value partitioned by category without an ORDER BY clause or a window_frame parameter:   SELECT category, SUM(value) OVER (PARTITION BY category) AS total_value FROM sales;  ...
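To make the rewrite concrete, here is a hand-written query that is logically equivalent to the windowed query above (ignoring NULL partition keys); this only illustrates the shape the optimizer targets, not its actual plan output:

-- Aggregate once per category, then join the result back to every source row:
SELECT s.category, t.total_value
FROM sales AS s
JOIN (
  SELECT category, SUM(value) AS total_value
  FROM sales
  GROUP BY category
) AS t
  ON s.category = t.category;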

rk2511
by New Contributor
  • 410 Views
  • 1 reply
  • 0 kudos

Access Each Input Item of a For Each Task

I have two tasks. The first task (Sample_Notebook) returns a JSON array (Input_List). Sample data in Input_List: ['key1': value1, 'key2': value2, 'key3': value3]. The second task is a "For Each" task that executes a notebook for each entry in the Input_List...

Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

To access each item of the iteration within the notebook of the second task in your Databricks workflow, you need to utilize the parameterization feature of the For Each task. Instead of trying to retrieve the entire list using dbutils.jobs.taskValue...

AnaMocanu
by New Contributor III
  • 362 Views
  • 2 replies
  • 0 kudos

Streamlit app on Databricks doesn't recognise the DATABRICKS_WAREHOUSE_ID in the yaml file

Hey everyone, so I managed to create a Streamlit app on Databricks; it works fine deployed in the cloud. However, when I try to run it locally, it complains about "assert os.getenv('DATABRICKS_WAREHOUSE_ID'), "DATABRICKS_WAREHOUSE_ID must be set in app.ya...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @AnaMocanu, how are you setting up DATABRICKS_WAREHOUSE_ID? When running the app locally, the environment variables set in the app.yaml file might not be picked up automatically. You can manually set the environment variable in your local environ...

1 More Replies
hank12345
by New Contributor
  • 549 Views
  • 2 replies
  • 0 kudos

Resolved! Lakehouse federation support for Oracle DB

https://docs.databricks.com/en/query-federation/index.html Are there plans to provide Oracle support for Databricks on AWS Lakehouse Federation? Not sure if that's possible or not. Thanks!

Latest Reply
PiotrU
Contributor II
  • 0 kudos

Federation with Oracle is available: https://learn.microsoft.com/en-us/azure/databricks/query-federation/oracle

1 More Replies
maoruales32
by New Contributor
  • 121 Views
  • 1 reply
  • 0 kudos

Point map datapoints labels

The point map visualization does not let me use a specific column for the data point labels.

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Previously replied in https://community.databricks.com/t5/warehousing-analytics/datapoints-labels-on-a-point-map-visualization/td-p/101839 

aburkh
by New Contributor
  • 254 Views
  • 1 reply
  • 0 kudos

User default timezone (SQL)

Users get confused when querying data with timestamps because UTC is not intuitive for many. It is possible to set TIME ZONE at query level or at SQL Warehouse level, but those options fail to address the need of multiple users working on the same wa...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

It is possible to set the time zone at the session level using the SET TIME ZONE statement in Databricks SQL. This allows users to control the local timezone used for timestamp operations within their session. However, there is no direct option of us...
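A minimal sketch of that session-level override; the zone name is just an example:

-- Applies only to the current session on the warehouse:
SET TIME ZONE 'America/New_York';

-- Timestamp functions in this session now render in the chosen zone:
SELECT current_timestamp() AS local_now;

-- Revert to the warehouse/system default:
SET TIME ZONE LOCAL;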

atikiwala
by New Contributor
  • 519 Views
  • 1 reply
  • 0 kudos

Resolved! Working with Databricks Apps

I'm trying to use Databricks Apps to host a Streamlit app to serve some interactive application. I face two limitations: 1. In the environment for the App I see it using a certain Python version, but how do I update it to use another version? It is already set to...

Latest Reply
parthSundarka
Databricks Employee
  • 0 kudos

Hi @atikiwala, good day! Python 3.11 is currently the only version we support. We are thinking of adding additional options in the future. Would love to hear your feedback on this: https://docs.databricks.com/en/resources/ideas.html#submit-product-...

Akshay_Petkar
by Contributor II
  • 877 Views
  • 4 replies
  • 2 kudos

How to Display Top Categories in Databricks AI/BI Dashboard?

In a Databricks AI/BI dashboard, I have a field with multiple categories (e.g., district-wise sales with 50 districts). How can I display only the top few categories (like the top 10) based on a specific metric such as sales?

Latest Reply
Mo
Databricks Employee
  • 2 kudos

Hey @migq2, @maks, in the AI/BI dashboards, in your data, add a limit parameter like select * from my_table limit :limit_number to all your tables. When you're on the canvas and adding visualizations, add a filter and create a parameter with single val...
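A minimal sketch of a top-N dataset query for this kind of chart; the table and column names are placeholders, and the hard-coded LIMIT can be swapped for the :limit_number parameter described above:

-- Keep only the ten districts with the highest total sales:
SELECT district, SUM(sales) AS total_sales
FROM sales_by_district
GROUP BY district
ORDER BY total_sales DESC
LIMIT 10;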

3 More Replies
igorstar
by New Contributor III
  • 3547 Views
  • 3 replies
  • 2 kudos

Resolved! What is the difference between LIVE TABLE and MATERIALIZED VIEW?

From the DLT documentation it seems that LIVE TABLE is conceptually the same as MATERIALIZED VIEW. When should I use one over the other?

Latest Reply
Mo
Databricks Employee
  • 2 kudos

@ImranA and @igorstar, I'll repost my response here: to create materialized views, you could use CREATE OR REFRESH LIVE TABLE; however, according to the official docs: "The CREATE OR REFRESH LIVE TABLE syntax to create a materialized view is deprecat...
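For reference, a small sketch of the two spellings side by side, assuming the statements run inside a DLT pipeline; the table name and query are placeholders:

-- Deprecated DLT syntax:
CREATE OR REFRESH LIVE TABLE daily_orders AS
SELECT order_date, COUNT(*) AS order_count
FROM orders
GROUP BY order_date;

-- Recommended syntax for the same materialized view:
CREATE OR REFRESH MATERIALIZED VIEW daily_orders AS
SELECT order_date, COUNT(*) AS order_count
FROM orders
GROUP BY order_date;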

2 More Replies
