Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 3477 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
deepakharin
by New Contributor
  • 901 Views
  • 3 replies
  • 0 kudos

Notebook using Assistant

I want to create notebooks for creating and accessing a lakehouse, but I want them to be created using the Assistant. Is there a step-by-step guide for doing this?

Latest Reply
SebastianRowan
Contributor
  • 0 kudos

Open the Assistant and ask it to spin up a lakehouse notebook; it should drop in code you can run right away to connect and explore. If you're planning to download the notebook, make sure you download it in PDF format and rename it after download ...

2 More Replies
dylan19
by New Contributor II
  • 941 Views
  • 3 replies
  • 0 kudos

How to open a SQL Warehouse query in SQL Warehouse?

How can you control whether a file is a Notebook or a Query? Long story short: can you do anything to make Databricks recognise a .dbquery.ipynb file as a query and open it in SQL Warehouse? I created a file in my Git folder as a query. All good. ...

Latest Reply
Khaja_Zaffer
Esteemed Contributor
  • 0 kudos

Hello @dylan19, good day! That's very frustrating! Just to check: did you change the .dbquery.ipynb file to a .sql file in your Git repository? For example, extract the SQL content from the .ipynb file (a JSON format) and save it as plain text in ...
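The extraction step described above can be sketched in Python. This assumes the .dbquery.ipynb file follows the standard Jupyter JSON layout (a `cells` list whose entries carry `source` lines); Databricks' actual .dbquery format may differ, and the file names below are placeholders:

```python
import json

def extract_sql(ipynb_path: str, sql_path: str) -> None:
    """Read a notebook-style JSON file and write its cell sources
    out as a plain .sql file."""
    with open(ipynb_path, encoding="utf-8") as f:
        nb = json.load(f)
    # Join the source lines of every cell; a query file typically
    # holds a single SQL cell.
    sql = "\n".join("".join(cell.get("source", [])) for cell in nb.get("cells", []))
    with open(sql_path, "w", encoding="utf-8") as f:
        f.write(sql)

# Demo with a minimal notebook-style file.
sample = {"cells": [{"cell_type": "code", "source": ["SELECT *\n", "FROM my_table"]}]}
with open("sample.dbquery.ipynb", "w", encoding="utf-8") as f:
    json.dump(sample, f)
extract_sql("sample.dbquery.ipynb", "sample.sql")
print(open("sample.sql").read())  # prints the extracted SQL
```

Per the suggestion above, committing the resulting .sql file should let Databricks open it as a query rather than a notebook.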

2 More Replies
gmont4m
by Databricks Partner
  • 574 Views
  • 1 reply
  • 1 kudos

Shared Compute - sys path not working

We were working with Python scripts on serverless compute (Git folder). The following code worked when we were calling custom modules sitting alongside the Python scripts (e.g. preprocess data from preprocessing.py). We've moved to shared compute but can no lo...

Latest Reply
nayan_wylde
Esteemed Contributor II
  • 1 kudos

It seems you are using a workspace path. To confirm: are you using dedicated compute with access provided to a group? If so, you need to grant the group Can Manage access on the workspace path.
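Beyond workspace permissions, the import failure in the original question can often be worked around by putting the module folder on sys.path explicitly, since on some compute modes the working directory of a Git folder is not added automatically. A minimal sketch; the workspace path and module name are hypothetical placeholders:

```python
import sys

# Hypothetical path to the Git folder that contains preprocessing.py;
# replace with your own workspace/repo path.
module_dir = "/Workspace/Users/someone@example.com/my_repo"

# Add the folder once, before importing any custom modules.
if module_dir not in sys.path:
    sys.path.append(module_dir)

# After this, an import such as
#   from preprocessing import preprocess_data
# should resolve, provided the cluster (or the group using it) has
# access to that workspace path.
```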

Andrea_
by New Contributor
  • 696 Views
  • 1 reply
  • 0 kudos

Change the default font for plots

Hi community, as the title says, I have been checking the documentation but could not find a clear answer to my question. I am working on a project, and I need to change all the plot fonts to match a specific brand. All plots are generated with ...

Latest Reply
SebastianRowan
Contributor
  • 0 kudos

Set plt.rcParams['font.family'] = 'YourBrandFontName' at the start, and all subsequent plots should follow that font automatically.
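As a fuller sketch of that suggestion: set the rcParams before creating any figures, and the settings apply to every plot that follows. 'DejaVu Sans' stands in for the brand font here; a real brand font must be installed where matplotlib can find it (or registered via matplotlib.font_manager.fontManager.addfont):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, e.g. for headless jobs
import matplotlib.pyplot as plt

# Placeholder font: swap in your brand's font family name.
plt.rcParams["font.family"] = "DejaVu Sans"
plt.rcParams["font.size"] = 11

# Every figure created after this point inherits the settings.
fig, ax = plt.subplots()
ax.plot([1, 2, 3], [2, 4, 9])
ax.set_title("Branded plot")
fig.savefig("branded_plot.png")
```

For a one-off override, the same keys can also be set temporarily with plt.rc_context(...) instead of globally.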

excavator-matt
by Contributor III
  • 1377 Views
  • 2 replies
  • 2 kudos

Resolved! Strange selection of districts for Finland in choropleth visualisation. Something to reconsider?

Hi! We're trying to use the choropleth visualisation in the new dashboard. This works well for all countries we tried, except at the district level for Finland. The supported districts are clearly stated in the CSV on the documentation page titled county-di...

Warehousing & Analytics
choropleth
dashboard
districts
finland
Latest Reply
Alex_Lichen
Databricks Employee
  • 2 kudos

Hi Matt, you are correct that today the choropleth maps use sub-regions as defined in https://en.wikipedia.org/wiki/Sub-regions_of_Finland. We are currently getting these boundaries from Mapbox, which provides this boundary data. Definitel...

1 More Replies
eliles
by New Contributor
  • 1000 Views
  • 1 reply
  • 0 kudos

Dashboard default date format changed from YYYY-MM-DD to DD/MM/YYYY

I noticed a change in date formatting behavior in my dashboards starting yesterday. Details: My workspace setting: the default date format is set to YYYY-MM-DD (confirmed in user settings). Issue: any new date fields added to dashboards are defaulting to DD/...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @eliles, you can check the dashboard settings. From there you can change the locale configuration, i.e. set number and date formatting standards based on geographic or cultural preferences: Dashboard settings - Azure Databricks | Microsoft Learn

Pilsner
by Databricks Partner
  • 2287 Views
  • 3 replies
  • 4 kudos

Resolved! Fabric one lake migration

We are trying to migrate our data from Fabric OneLake to Unity Catalog. Has anyone had experience with this before? Any pointers or things to be aware of would be appreciated.

Latest Reply
Pilsner
Databricks Partner
  • 4 kudos

Hello @SP_6721 and @nayan_wylde, thank you both for your replies; I appreciate the pointers. I'll definitely look into your suggestion of creating an intermediate step instead of migrating directly from Fabric to Databricks.

2 More Replies
rathorer
by Databricks Partner
  • 2659 Views
  • 0 replies
  • 0 kudos

Building Data Models on Databricks Platform

This post describes data models and how to build them on the Databricks Platform, with a focus on Data Vault and Data Mesh. Data Vault: What is Data Vault? Data Vault is a modern data modeling technique designed for agile, scalable, and auditable enterpri...

AE10
by New Contributor III
  • 5058 Views
  • 5 replies
  • 12 kudos

Embed dashboards in websites and applications

I am embedding a dashboard in my React application using an iframe. https://www.databricks.com/blog/how-embed-aibi-dashboards-your-websites-and-applications https://docs.databricks.com/en/dashboards/index.html#embed-a-dashboard 1. How can I let my users...

Latest Reply
Preeti_Singh
New Contributor II
  • 12 kudos

Hi, has anyone gotten this working? I am facing the same issue when trying to get the embedded URL working from an iframe in my React application. I think I have all the necessary permissions in place, but still, the Databricks sign-in page opens in...

4 More Replies
Pilsner
by Databricks Partner
  • 2045 Views
  • 1 reply
  • 1 kudos

External connection to hive metastore

Hello, we are currently having an issue writing tables to the Hive metastore. I've tried to outline what we know and have tried so far below. Known situation: we have a Databricks environment, but it does not currently use Unity Catalog. We instead have a...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Here are some recommendations and tips/tricks: 1. Understanding the architecture and common issues. In legacy Databricks setups, the Hive metastore is used to manage tables and their metadata. When writing via ODBC (using the Simba Spark driver) or ...

Olfa_Kamli
by New Contributor III
  • 2134 Views
  • 3 replies
  • 3 kudos

Resolved! Unable to create Databricks SQL Endpoint using Terraform Clusters failing to launch

Hi all, I'm trying to create a Databricks SQL endpoint using Terraform with the following resource configuration:

resource "databricks_sql_endpoint" "dataproduct_sql_endpoint" {
  provider = databricks.workspace
  name     ...

Warehousing & Analytics
SQl endpoint
Terraform
warehouse
Latest Reply
Olfa_Kamli
New Contributor III
  • 3 kudos

@SP_6721 just a quick heads-up: this is all sorted now. It turns out the compute (warehouse) was just missing a tag. I added it, and everything started working as expected. Thank you!

2 More Replies
Sitharth
by Databricks Partner
  • 1474 Views
  • 2 replies
  • 3 kudos

Error: QUERY_RESULT_WRITE_TO_CLOUD_STORE_FAILED when querying system tables via Databricks dashboard

Hi team, I'm encountering the following error when running queries on system tables through a Databricks dashboard using a serverless SQL warehouse: [QUERY_RESULT_WRITE_TO_CLOUD_STORE_FAILED] An internal error occurred while uploading the result set to...

Latest Reply
Khaja_Zaffer
Esteemed Contributor
  • 3 kudos

Hello @Sitharth, good day! I think the cause is this: by default, containers are missing from the Databricks-managed storage account. Resolution: you need to create a new workspace. With this we can close the case.

1 More Replies
omjohn
by New Contributor
  • 1489 Views
  • 1 reply
  • 0 kudos

Sparklyr error in spark_apply: Error: java.lang.NoSuchMethodError

When trying to incorporate an R package into my Spark workflow using the spark_apply() function in sparklyr, I get the error: Error: java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.encoders.RowEncoder$.apply(Lorg/apache/spark/sql/types/StructT...

Latest Reply
Vidhi_Khaitan
Databricks Employee
  • 0 kudos

Hi @omjohn, can you try downgrading to Databricks Runtime 13.3 LTS, which uses Spark 3.4.x and is officially supported by sparklyr 1.8.1? I believe it would provide a more stable and better-tested integration.

DM3910
by Databricks Partner
  • 1494 Views
  • 1 reply
  • 1 kudos

How to replicate Unity Catalog and workspaces from one region to another in Databricks

I need to replicate an existing Unity Catalog and workspaces from one region to another in Databricks. I need help understanding the different ways in which this can be done.

Latest Reply
TheOC
Databricks Partner
  • 1 kudos

Hey @DM3910, it may not be exactly what you need, but there is some information in this article around cross-region/platform sharing: https://docs.databricks.com/aws/en/data-governance/unity-catalog/best-practices#cross-region-and-cross-platform-sharin...
