Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 3129 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap: use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards, and insights. Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
igorstar
by New Contributor III
  • 6570 Views
  • 3 replies
  • 2 kudos

Resolved! What is the difference between LIVE TABLE and MATERIALIZED VIEW?

From the DLT documentation it seems that the LIVE TABLE is conceptually the same as MATERIALIZED VIEW. When should I use one over another?

Latest Reply
Mo
Databricks Employee
  • 2 kudos

@ImranA and @igorstar I'm reposting my response here: to create materialized views, you could use CREATE OR REFRESH LIVE TABLE; however, according to the official docs, the CREATE OR REFRESH LIVE TABLE syntax to create a materialized view is deprecat...
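Per the current docs, the SQL replacement is CREATE OR REFRESH MATERIALIZED VIEW; in a DLT Python pipeline the same thing is expressed with the @dlt.table decorator. A minimal sketch, with placeholder catalog, table, and column names:

import dlt
from pyspark.sql import functions as F

# Inside a DLT pipeline, @dlt.table materializes the result of a batch query,
# i.e. the Python counterpart of CREATE OR REFRESH MATERIALIZED VIEW.
# "main.default.orders" and the columns below are placeholders.
@dlt.table(name="daily_orders", comment="Hypothetical materialized aggregate")
def daily_orders():
    return (
        spark.read.table("main.default.orders")
        .groupBy("order_date")
        .agg(F.count("*").alias("order_count"))
    )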

  • 2 kudos
2 More Replies
AnnaP
by New Contributor II
  • 1830 Views
  • 1 replies
  • 0 kudos

[UNBOUND_SQL_PARAMETER] error

Hi, I'd appreciate it if anyone could help! We are using the official ODBC driver (Simba Spark ODBC Driver 64-bit 2.08.02.1013) in our application via its C++ APIs. All of the following SQL statements are passed through the ODBC API to Databricks: successfully executing: C...

Latest Reply
PiotrMi
Contributor
  • 0 kudos

@AnnaP Hey, did you try the below? To disable the SQL Connector feature, select the Use Native Query check box. Important: when this option is enabled, the connector cannot execute parameterized queries. By default, the connector applies transformations to...

  • 0 kudos
Akshay_Petkar
by Valued Contributor
  • 2144 Views
  • 1 replies
  • 2 kudos

How to Create a Live Streaming Dashboard on Databricks?

I am working on a use case where I have streaming data that needs to be displayed in real-time on a live dashboard. The goal is for any new data arriving in the stream to instantly reflect on the dashboard. Is this possible on Databricks? If yes, how...

Latest Reply
christopher356
New Contributor II
  • 2 kudos

@Akshay_Petkar wrote: I am working on a use case where I have streaming data that needs to be displayed in real-time on a live dashboard. The goal is for any new data arriving in the stream to instantly reflect on the dashboard. Is this possible on Da...
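A minimal sketch of one common pattern (not necessarily what this thread recommends; the source, checkpoint path, and table name are placeholders): continuously land the stream in a Delta table, then point a dashboard with a short refresh interval at that table.

# Write the incoming stream to a Delta table that a Databricks SQL dashboard can query.
(
    spark.readStream
    .format("rate")  # placeholder streaming source
    .load()
    .writeStream
    .option("checkpointLocation", "/tmp/checkpoints/live_dashboard")  # hypothetical path
    .toTable("main.default.live_metrics")  # hypothetical target table
)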

  • 2 kudos
User16753724663
by Databricks Employee
  • 15355 Views
  • 5 replies
  • 3 kudos

Resolved! Unable to use CX_Oracle library in notebook

While using the cx_oracle Python library, it returns the below error message: Cannot locate a 64-bit Oracle Client library: "libclntsh.so: cannot open shared object file: No such file or directory". The cx_oracle library is dependent on native...

Latest Reply
ovbieAmen
New Contributor II
  • 3 kudos

Hi @AshvinManoj, I used your script and still get the same error.
sudo echo 'LD_LIBRARY_PATH="/dbfs/databricks/instantclient_23_6"' >> /databricks/spark/conf/spark-env.sh
sudo echo 'ORACLE_HOME="/dbfs/databricks/instantclient_23_6"' >> /databricks/spark...
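For reference, one way to apply those environment variables on every node is a cluster-scoped init script; a minimal sketch written from a notebook, assuming the Instant Client was unpacked to the path used in this thread:

# Write a cluster init script that exports the Oracle client paths before Spark starts.
init_script = """#!/bin/bash
echo 'export LD_LIBRARY_PATH=/dbfs/databricks/instantclient_23_6' >> /databricks/spark/conf/spark-env.sh
echo 'export ORACLE_HOME=/dbfs/databricks/instantclient_23_6' >> /databricks/spark/conf/spark-env.sh
"""
dbutils.fs.put("dbfs:/databricks/init-scripts/oracle-client.sh", init_script, overwrite=True)
# The script path then goes into the cluster's Advanced options > Init scripts.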

  • 3 kudos
4 More Replies
mbhakta
by New Contributor II
  • 6831 Views
  • 3 replies
  • 2 kudos

Change Databricks Connection on Power BI (service)

We're creating a report with Power BI using data from our AWS Databricks workspace. Currently, I can view the report on Power BI (service) after publishing. Is there a way to change the data source connection, e.g. if I want to change the data source...

Latest Reply
J_C
New Contributor II
  • 2 kudos

In the Power BI transform data view, you should be able to access the M-Query code and actually change the server and the host directly. My recommendation is to create a couple of parameters to keep this info for all your queries. Then you can just c...

  • 2 kudos
2 More Replies
anardinelli
by Databricks Employee
  • 889 Views
  • 0 replies
  • 2 kudos

How do I dimension my DBSQL warehouse correctly?

What is the optimal number of clusters/nodes in a warehouse? It depends on your workload. Our DBSQL guide suggests a size range based on two main things: the time to execute your query and the bytes spilled from it. This link can help you better understand optimizatio...

bradleyjamrozik
by New Contributor III
  • 3441 Views
  • 3 replies
  • 0 kudos

ODBC Connection Does Not Disconnect

I have an on-premises Power BI Report Server that uses the Simba Spark ODBC Driver (2.8) to connect to Databricks. It can connect to a serverless warehouse successfully and run its queries, but it never seems to disconnect the session, and so the war...

Latest Reply
gmiguel
Contributor
  • 0 kudos

Hi @bradleyjamrozik, the problem is that the currently designed behavior of Power BI can lead to the connection not being closed under certain conditions. In short, since Databricks is async, if an error occurs while fetching data, the connection may no...

  • 0 kudos
2 More Replies
Akshay_Petkar
by Valued Contributor
  • 1983 Views
  • 2 replies
  • 1 kudos

Best Approach for Migrating from EMR to Databricks

I am preparing to migrate from EMR to Databricks and would like to know the best practices for this process. Is there a direct connector, such as a JDBC connector, available to facilitate the migration? Alternatively, would it be more effective to ex...

Latest Reply
thelogicplus
Contributor II
  • 1 kudos

@Akshay_Petkar if you're planning to migrate from your current technology to Databricks, Travinto Technologies' Code Converter Tool is here to make the process seamless. This powerful tool enables you to migrate data, ETL workflows, and reports acros...

  • 1 kudos
1 More Replies
San2
by New Contributor
  • 3308 Views
  • 1 replies
  • 0 kudos

recover a deleted workspace

I cancelled my subscription plan (account created via accounts.databricks.com) without noticing that, by doing this, all of my workspaces would be deleted. Is it possible to recover those workspaces, and if so, what should I do to recover them?

Latest Reply
steyler-db
Databricks Employee
  • 0 kudos

Hello San2. To recover a workspace, could you please raise a case with support? They will help you recover these workspaces; otherwise, you can reach out to your account team, who will help you with further steps. For more details, see disaster ...

  • 0 kudos
Ismail1
by New Contributor III
  • 4911 Views
  • 3 replies
  • 0 kudos

Migrating from Databases Postgres MySQL to Databricks.

Hi all, working on this project, my team plans to migrate some data from some databases to Databricks. We plan to run this migration by submitting queries to a warehouse through Python on a local machine. Now I was wondering what would be the best app...
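For the "submit queries to a warehouse from local Python" part, the databricks-sql-connector package is the usual route; a minimal sketch with placeholder connection details:

from databricks import sql  # pip install databricks-sql-connector

# Placeholders: copy these from the SQL warehouse's "Connection details" tab.
with sql.connect(
    server_hostname="<workspace-host>.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT current_catalog(), current_schema()")
        print(cursor.fetchone())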

Latest Reply
thelogicplus
Contributor II
  • 0 kudos

@Ismail1  if you're planning to migrate from your current technology to Databricks, Travinto Technologies' Code Converter Tool is here to make the process seamless. This powerful tool enables you to migrate data, ETL workflows, and reports across pla...

  • 0 kudos
2 More Replies
martindlarsson
by New Contributor III
  • 1571 Views
  • 3 replies
  • 2 kudos

Warning in dbt task

We are using dbt core and running our transformations in a dbt task. A few months ago we started to see a warning message in our runs. This started happening after an update of dbt core, and it seems that Databricks needs to update the def...

Latest Reply
martindlarsson
New Contributor III
  • 2 kudos

Both of you missed the sentence "Databricks needs to update the definition of the generated profiles file." The warning is regarding the structure of the profiles.yml which is generated by Databricks. This is not anything I as a user can change, except ...

  • 2 kudos
2 More Replies
kgilson
by New Contributor
  • 1238 Views
  • 1 replies
  • 0 kudos

Databricks Apps connection to Azure SQL server

Hello, I am new to Databricks and was asked to evaluate the new Apps function as a way to publish dashboards. We do not use Databricks today. We have an Azure SQL database that we use as a home-grown warehouse. The question is: can I connect directly to ...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

To connect your Azure SQL database to Databricks Apps, you will need to pull the data from the existing database into Databricks. Databricks Apps are designed to leverage the data and features within the Databricks platform, which means that the data...
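A minimal sketch of that pull, assuming a plain JDBC read from Azure SQL into a Delta table (server, database, table, and secret names are placeholders):

# Read one table from Azure SQL over JDBC and persist it as a Delta table.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>")
    .option("dbtable", "dbo.sales")  # hypothetical source table
    .option("user", "app_user")
    .option("password", dbutils.secrets.get("my-scope", "azure-sql-password"))
    .load()
)
df.write.mode("overwrite").saveAsTable("main.default.sales")  # hypothetical target table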

  • 0 kudos
sruthianki
by New Contributor II
  • 1178 Views
  • 3 replies
  • 0 kudos

Migrating huge table from synapse to databricks

Hi, we are looking for an option to copy tables of more than 50 TB from Synapse to Databricks on a weekly basis; please suggest if there are any feasible ways to do this. We are using the connector below, but it is taking too long to copy: https://learn.micro...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

There is no Databricks documentation on this, as Databricks is only involved for a very tiny bit: "CREATE TABLE catalog.schema.table USING PARQUET LOCATION 'url_to_the_parquet_files'". All the rest is done in Azure Data Factory, or you can even use the built-in ...
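Run from a notebook, that statement looks roughly like the sketch below (the storage URL is a placeholder; the Parquet files are assumed to have already been exported there):

# Register the exported Parquet files as an external table, per the reply above.
spark.sql("""
    CREATE TABLE IF NOT EXISTS catalog.schema.table
    USING PARQUET
    LOCATION 'abfss://<container>@<account>.dfs.core.windows.net/<path>/'
""")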

  • 0 kudos
2 More Replies
vaishalisai
by New Contributor II
  • 1051 Views
  • 2 replies
  • 0 kudos

Account client not able to get workspace client

So I did U2M authentication for account-level operations:
databricks auth login --host <account-login-url> --account-id <account-id>
Then I tried to run this code:
workspaces = account_client.workspaces.list()
workspace_obj = account_client.get_workspace_clien...

Latest Reply
vaishalisai
New Contributor II
  • 0 kudos

I am doing this using a service principal, so my .databrickscfg has:
[databricks-demo]
client_id = -----
client_secret = ----
host = https://accounts.cloud.databricks.com/
account_id = ----
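With a profile like that, a minimal sketch of listing workspaces and then obtaining a workspace-scoped client via the Databricks SDK for Python (the profile name follows the reply above; the rest is illustrative):

from databricks.sdk import AccountClient

# Account-level client built from the [databricks-demo] profile in ~/.databrickscfg.
account_client = AccountClient(profile="databricks-demo")

workspaces = list(account_client.workspaces.list())
for ws in workspaces:
    print(ws.workspace_id, ws.workspace_name)

# get_workspace_client exchanges the account-level service principal credentials
# for workspace-level auth against the chosen workspace.
workspace_client = account_client.get_workspace_client(workspaces[0])
print(workspace_client.current_user.me().user_name)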

  • 0 kudos
1 More Replies
mtr_nx
by New Contributor II
  • 1522 Views
  • 2 replies
  • 0 kudos

Dashboarding Tooltips - How-To's?

Hi folks, I am looking for any documentation on how Databricks Dashboards handle creating tooltips. Specifically, I am interested in dashboards created in SQL Warehouses (not Workspace Clusters, since using libraries like plotly can have custom to...

Latest Reply
mtr_nx
New Contributor II
  • 0 kudos

Thank you for sharing this. Unfortunately it doesn't really contain much information on how to manage the tooltips that pop up when you hover over a bar or line in a chart/figure. That's what I am looking to customize in (or at least remove from) my ...

  • 0 kudos
1 More Replies