Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 2989 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
ogs
by New Contributor II
  • 891 Views
  • 3 replies
  • 2 kudos

Data selection from adls2 in Serverless Warehouse

Hi everyone, I'm trying to query data from our adls2 delta lake using a serverless sql warehouse. We've already set up private connectivity via NCC, but hitting a snag when running queries like: SELECT * FROM delta.`abfss://container@xxx.dfs.core.windo...

Latest Reply
ogs
New Contributor II
  • 2 kudos

Hi, thanks for the detailed explanation. Unfortunately, configuring fs.azure.account.key in the Serverless advanced options didn't help (I'm sure I wrote it correctly) - still receiving the same error. I saw in some sources around the net that I should...

2 More Replies
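
The thread above is about querying an ADLS Gen2 Delta path from a serverless SQL warehouse. As a hedged sketch (storage account, container, catalog and table names below are hypothetical), the pattern being attempted looks roughly like this, shown as spark.sql calls from a notebook; the same SQL runs in a warehouse. On serverless, access to abfss paths is typically granted via a Unity Catalog external location and storage credential rather than an account key, which may be why the advanced-options key did not help.

```python
# Hedged sketch, not the confirmed fix: query a Delta path directly, or register it
# once as a Unity Catalog external table. All names/paths are placeholders.
path = "abfss://container@storageaccount.dfs.core.windows.net/delta/events"

# Direct path query against a Delta location
spark.sql(f"SELECT * FROM delta.`{path}` LIMIT 10").show()

# Alternative: register the location as an external table, then query it by name
spark.sql(f"CREATE TABLE IF NOT EXISTS main.bronze.events LOCATION '{path}'")
spark.sql("SELECT COUNT(*) FROM main.bronze.events").show()
```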
singh_tushar_14
by New Contributor II
  • 669 Views
  • 4 replies
  • 1 kudos

Resolved! Datetime conversion on streaming tables

I am using streaming tables to read from multiple parquet files and create a table in my raw layer. Tech stack: dbt-databricks. While loading, I have a column called "ets" in my source which is nothing but "milliseconds since Jan 1st 1970." I want this to ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @singh_tushar_14, since your ets column already contains an integer that represents milliseconds since Jan 1st 1970, can't you just use a Spark SQL function? You don't need to add anything to 1970-01-01, since that information is already "encoded" in y...

3 More Replies
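
To illustrate the conversion suggested in the reply above, here is a minimal sketch; the column name ets comes from the post, the sample data is made up, and the same expression works as plain SQL in a dbt-databricks model.

```python
# Minimal sketch: convert a milliseconds-since-epoch column ("ets") to a timestamp.
from pyspark.sql import functions as F

df = spark.createDataFrame([(1700000000000,)], ["ets"])

converted = (
    df
    # Spark SQL's timestamp_millis() treats the value as ms since 1970-01-01 UTC
    .withColumn("ets_ts", F.expr("timestamp_millis(ets)"))
    # Equivalent without the helper: divide by 1000 and cast to timestamp
    .withColumn("ets_ts_cast", (F.col("ets") / 1000).cast("timestamp"))
)
converted.show(truncate=False)

# In a dbt-databricks model this is roughly: SELECT timestamp_millis(ets) AS ets_ts FROM ...
```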
lizou1
by New Contributor III
  • 628 Views
  • 2 replies
  • 1 kudos

serverless compute: cannot find column index for attribute on complex view

serverless compute: cannot find column index for attribute error. We are seeing this error if there is a complex view; the same view runs fine in a SQL warehouse but breaks in serverless compute only. It looks like serverless compute has an issue parsing large co...

Latest Reply
lizou1
New Contributor III
  • 1 kudos

ok, I will test on latest version 4 and will let you know if any more issues, thanks

1 More Replies
JPan
by New Contributor III
  • 729 Views
  • 4 replies
  • 5 kudos

Strange query error with 'trac%'

Hi All, I've encountered what seems like a very specific bug. For some reason, running a query with the following where clause results in an execution error: where lower(organization_name) like '%trac%'. There's no execution error when trying "tra" and i...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor II
  • 5 kudos

@JPan I noticed you had the new SQL editor mode on in your video. For what it's worth, I flicked between both when testing and both were successful. Given that you're finding it's only happening on some fields, do you think this is to do with special...

3 More Replies
bhanu_gautam
by Valued Contributor III
  • 522 Views
  • 3 replies
  • 2 kudos

Resolved! New Features in Dashboard

What are the new features we are going to see in the Dashboard this year?

Latest Reply
BS_THE_ANALYST
Esteemed Contributor II
  • 2 kudos

@szymon_dybczak thanks for those suggestions. I didn't even know about the Product Roadmap Webinar every quarter. I'll definitely be tuning in for those. All the best, BS

2 More Replies
tawarity
by New Contributor II
  • 9117 Views
  • 3 replies
  • 0 kudos

Python Requests Library Error ImportHookFinder.find_spec()

Hi All, I've been using notebooks to run patch requests to an external API using the Python requests library. Oftentimes certain notebooks will randomly start to fail throughout the day and will raise an ImportHookFinder.find_spec() error when attempt...

Latest Reply
ChrisLawford_n1
Contributor
  • 0 kudos

Hey, did you manage to solve this issue? I am experiencing the same error but under a different context of using pytest in a notebook. My scenario: import pytest; import os; import sys; os.chdir(f'/Workspace/Shared/DataTechnology-Databricks-Core') # Skip...

2 More Replies
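
For context on the reply's scenario (this is not a fix for the ImportHookFinder error itself), a common way to drive pytest from a Databricks notebook looks roughly like the sketch below; the repo path and test folder are hypothetical.

```python
# Hedged sketch: running pytest from a notebook. Paths are illustrative.
import os
import sys
import pytest

# Avoid writing .pyc files into the (read-only) Workspace filesystem
sys.dont_write_bytecode = True

# Run from the repo root so relative imports and test discovery behave as they do locally
os.chdir("/Workspace/Shared/my_repo")

# Execute everything under ./tests; fail the notebook cell if any test fails
retcode = pytest.main(["-v", "tests"])
assert retcode == 0, f"pytest exited with code {retcode}"
```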
vijamit
by New Contributor
  • 329 Views
  • 2 replies
  • 0 kudos

Photon running out of memory error while executing SQL on SQL Serverless warehouse

Photon ran out of memory while executing this query. Photon failed to reserve 32.0 KiB for BufferPool, in BroadcastBufferedRelation(spark_plan_id=13407). Memory usage: Total task memory (including non-Photon): 1829.7 MiB BroadcastBufferedRelation(spa...

Latest Reply
Khaja_Zaffer
Contributor III
  • 0 kudos

Hello @vijamit, good day!! It was very hard to analyze the error, but the causes for the error shared are: your query has run out of memory during execution, specifically when using the BuildHashedRelation and PartitionedRelation functions. Running o...

1 More Replies
jtbing
by New Contributor
  • 743 Views
  • 2 replies
  • 3 kudos

Adding Tags to Saved Queries in New SQL Editor

Using the old SQL editor, there was an option to add tags to saved queries under the "edit query info" option shown in the screenshots. Since starting to use the new editor, I noticed that option has disappeared from its previous location and I haven...

Latest Reply
Advika
Databricks Employee
  • 3 kudos

Hello @jtbing! At the moment, adding or editing tags for saved queries isn't available in the new SQL editor UI. There's also an upcoming Bricktalks session with the Product Manager for Databricks SQL, introducing the new SQL editor; you may find it u...

1 More Replies
rnai369
by New Contributor III
  • 1389 Views
  • 6 replies
  • 3 kudos

Resolved! Unable to create SQL warehouse

Hi All, I am getting the below message when hovering on Create SQL warehouse. I can only see the Serverless Starter Warehouse compute. "You have reached the maximum number of SQL warehouses. Delete an existing warehouse to create new one." Regards, RN

Latest Reply
rnai369
New Contributor III
  • 3 kudos

Thanks to all for your support.

5 More Replies
deepakharin
by New Contributor
  • 492 Views
  • 3 replies
  • 0 kudos

Notebook using Assistant

I want to create notebooks for creating and accessing a lakehouse, but I want them to be created using the Assistant. Do we have a step-by-step guide to do it?

Latest Reply
SebastianRowan
Contributor
  • 0 kudos

Open the Assistant and ask it to spin up a lakehouse notebook, and it should drop in code you can run right away to connect and explore. If you're thinking of downloading the notebook, make sure you download it in PDF format and rename it after download ...

2 More Replies
dylan19
by New Contributor II
  • 433 Views
  • 3 replies
  • 0 kudos

How to open a SQL Warehouse query in SQL Warehouse?

How can you control whether a file is a Notebook or a Query? Long story short - can you do anything to make Databricks recognise a .dbquery.ipynb file as a query and open it in SQL Warehouse? I created a file in my git folder as a query. All good. ...

Latest Reply
Khaja_Zaffer
Contributor III
  • 0 kudos

Hello @dylan19, good day!! That's very frustrating! Well, just to check: did you change the .dbquery.ipynb file to a .sql file in your Git repository? For example, extract the SQL content from the .ipynb file (a JSON format) and save it as plain text in ...

2 More Replies
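
As a rough sketch of the reply's suggestion (file names are hypothetical, and this assumes the .dbquery.ipynb file uses the standard Jupyter JSON layout with a cells array), the SQL text can be pulled out and re-saved as a plain .sql file:

```python
# Hedged sketch: extract the SQL source from a .dbquery.ipynb (JSON) file and
# write it out as a plain .sql file. File names are placeholders.
import json

with open("my_query.dbquery.ipynb", "r", encoding="utf-8") as f:
    nb = json.load(f)

# Join the source lines of every cell; such files typically hold a single SQL cell
sql_text = "\n\n".join("".join(cell.get("source", [])) for cell in nb.get("cells", []))

with open("my_query.sql", "w", encoding="utf-8") as f:
    f.write(sql_text)
```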
AndreasWagner
by New Contributor II
  • 6426 Views
  • 1 reply
  • 1 kudos

PowerBI connected to SAP Databricks

hi everyone, does somebody have experience with connecting PowerBI to SAP Databricks in the BDC? I have quite a few SAP customers interested in that ... many thanks, Andreas

Latest Reply
WiliamRosa
Contributor
  • 1 kudos

Hi @AndreasWagner, I don't work with SAP, but I found this official material - maybe it helps you:
https://www.databricks.com/blog/introducing-sap-databricks
https://www.databricks.com/product/sap
https://www.databricks.com/resources/analyst-research/ma...

gmont4m
by New Contributor
  • 334 Views
  • 1 reply
  • 1 kudos

Shared Compute - sys path not working

We were working with Python scripts on serverless (Git folder). The following code worked when we were calling custom modules sitting in the Python scripts (e.g. preprocess data from preprocessing.py). We've moved to shared compute but can no lo...

Latest Reply
nayan_wylde
Honored Contributor III
  • 1 kudos

It seems you are using a workspace path. To confirm: are you using a dedicated compute with access provided to a group? You need to grant "can manage" access on the workspace path to the group.

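
For reference, the sys.path pattern the post describes looks roughly like the sketch below (the Git folder path is hypothetical; the preprocessing module name comes from the post). As the reply notes, if the group running the code lacks access to that workspace path, the append itself succeeds but the import can still fail.

```python
# Hedged sketch: make custom modules in a Git folder importable on shared compute.
# The repo path is illustrative.
import sys

repo_root = "/Workspace/Repos/someone@example.com/my_project"  # hypothetical Git folder
if repo_root not in sys.path:
    sys.path.append(repo_root)

# e.g. preprocessing.py living in repo_root
from preprocessing import preprocess_data  # hypothetical function name
```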
Andrea_
by New Contributor
  • 295 Views
  • 1 reply
  • 0 kudos

Change the default font for plots

Hi community, as the title says, I have been checking the documentation, but I could not find clear references to my question. I am working on a project, and I need to change all the plot fonts to match a specific brand. All plots are generated with ...

Latest Reply
SebastianRowan
Contributor
  • 0 kudos

Set plt.rcParams['font.family'] = 'YourBrandFontName' at the start and all your plots will follow that font automatically, most probably.

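
A minimal sketch of the rcParams approach from the reply; the font name is a placeholder, and the font must actually be installed or registered with matplotlib on the cluster.

```python
import matplotlib.pyplot as plt

# Apply the brand font (placeholder name) and a base size to every subsequent plot
plt.rcParams["font.family"] = "YourBrandFontName"
plt.rcParams["font.size"] = 11

fig, ax = plt.subplots()
ax.plot([1, 2, 3], [2, 4, 8])
ax.set_title("Title, labels and tick labels now use the configured font")
ax.set_xlabel("x")
ax.set_ylabel("y")
plt.show()
```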
Akshay_Petkar
by Valued Contributor
  • 4360 Views
  • 9 replies
  • 2 kudos

How to Display Top Categories in Databricks AI/BI Dashboard?

In a Databricks AI/BI dashboard, I have a field with multiple categories (e.g., district-wise sales with 50 districts). How can I display only the top few categories (like the top 10) based on a specific metric such as sales?

Latest Reply
BS_THE_ANALYST
Esteemed Contributor II
  • 2 kudos

@GunaR is this not where we'd want to leverage a parameter? And then pair this with a filter on the dashboard? Perhaps you create an aggregate SQL query for the various metrics. If you can't use a parameter in the ORDER BY clause, which you likely can...

8 More Replies
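
One way to sketch the "top 10 categories by a metric" idea discussed above is to pre-aggregate with ORDER BY ... LIMIT and point the dashboard dataset at that query. Table and column names below are hypothetical, and the SQL inside the call can be used directly as an AI/BI dashboard dataset.

```python
# Hedged sketch: top 10 districts by total sales, shown as a spark.sql call;
# main.analytics.district_sales and its columns are placeholders.
top_districts = spark.sql("""
    SELECT district,
           SUM(sales) AS total_sales
    FROM   main.analytics.district_sales
    GROUP  BY district
    ORDER  BY total_sales DESC
    LIMIT  10
""")
top_districts.show()
```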