Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.
Data + AI Summit 2024 - Data Warehousing, Analytics, and BI

Forum Posts

MadelynM
by Databricks Employee
  • 925 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
uberweiss
by New Contributor II
  • 4011 Views
  • 1 replies
  • 0 kudos

Unable to access Databricks cluster through ODBC in R

We have previously been able to access our Databricks cluster in R using ODBC, but it stopped working a couple of months ago and now I can't get it to connect. I've downloaded the latest drivers and added the right information in the odbc/odbcinst files bu...
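The ODBC connection parameters involved here are the same from any client, so a quick way to sanity-check the odbc/odbcinst entries is to assemble the equivalent connection string by hand. A minimal Python sketch, where the host, HTTP path, and token values are placeholders, not real workspace details:

```python
def databricks_odbc_conn_str(host, http_path, token,
                             driver="Simba Spark ODBC Driver"):
    """Assemble an ODBC connection string for a Databricks cluster.

    AuthMech=3 means username/password authentication, where the
    username is the literal string "token" and the password is a
    personal access token.
    """
    parts = {
        "Driver": driver,
        "Host": host,
        "Port": "443",
        "HTTPPath": http_path,
        "SSL": "1",
        "ThriftTransport": "2",
        "AuthMech": "3",
        "UID": "token",
        "PWD": token,
    }
    return ";".join(f"{k}={v}" for k, v in parts.items())

conn_str = databricks_odbc_conn_str(
    "adb-1234567890123456.7.azuredatabricks.net",  # placeholder host
    "/sql/protocolv1/o/0/0123-456789-abcdefgh",    # placeholder HTTP path
    "dapiXXXXXXXX",                                # placeholder token
)
```

In R, the same fields go into the odbc.ini/odbcinst.ini entries (or the `odbc::dbConnect()` arguments). A driver upgrade changing the driver name registered in odbcinst.ini is a common cause of connections that suddenly stop working.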

Warehousing & Analytics
cluster
Databricks
ODBC
R
peterlandis
by New Contributor II
  • 2578 Views
  • 1 replies
  • 0 kudos

Calculate the total size in bytes for a column

I wanted to calculate the total size in bytes for a given column of a table. I saw that you can use the bit_length function and did something like this, giving you the total bits of the column, but I'm not sure if this is correct. SELECT sum(bit_length(to...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

I looked at the docs of bit_length and they do not state whether it is measured before or after compression. However, since Spark decompresses data on read, it is very likely the size before compression. The table size is read from metadata and is compressed. To...
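As a sanity check on what `sum(bit_length(col))` measures, here is a small Python sketch that mirrors it for string data: `bit_length` counts the bits of the encoded value, so dividing by 8 gives the uncompressed byte size, not the on-disk compressed size. The column values are illustrative:

```python
def total_column_bytes(values):
    """Mirror of SQL sum(bit_length(col)) / 8 for a string column:
    bit_length counts the bits of the UTF-8 encoded value, i.e. the
    uncompressed size -- not the compressed size stored on disk."""
    return sum(len(v.encode("utf-8")) for v in values if v is not None)

# Illustrative data; "数据" is 6 bytes in UTF-8 (3 bytes per character),
# and NULLs contribute nothing, matching SQL aggregate semantics.
rows = ["delta", "lake", None, "数据"]
size = total_column_bytes(rows)  # 5 + 4 + 6 = 15 bytes
```

This matches the reply above: the per-column total from `bit_length` and the table size reported from metadata measure different things, so they will not agree.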

MonikaSamant
by New Contributor II
  • 2759 Views
  • 3 replies
  • 2 kudos

Need to connect Looker studio with Databricks tables

Hi Team, I am creating my data warehouse on AWS S3 and corresponding tables on a Databricks warehouse. I want to connect Looker Studio (which is different from Looker) to these Databricks tables and be able to create reports. Could you please help us on ...

Latest Reply
807326
New Contributor II
  • 2 kudos

We are also interested in this functionality, but there are no Databricks connectors for Looker Studio. However, as the Databricks SQL Statement Execution REST API is already available in public preview, it is now possible to build a Community Connector for d...

2 More Replies
charlie_cai
by New Contributor II
  • 1501 Views
  • 1 replies
  • 3 kudos

What is the difference between :443/default and Database=default in JDBC connection string

When I use the following Java code to get the namespace from AWS Databricks: import java.sql.Connection; import java.sql.DriverManager; import java.sql.ResultSet; import java.sql.Statement; import java.util.Properties; public class DatabricksJDBCExample { ...

Latest Reply
Tharun-Kumar
Databricks Employee
  • 3 kudos

@charlie_cai database is not a valid configuration parameter in the JDBC string. You can use ConnCatalog and ConnSchema to provide this information. This is also documented here: https://docs.databricks.com/en/integrations/jdbc-odbc-bi.html...
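Putting the reply together as a URL: the `/default` segment after `:443` is the legacy Hive-style schema slot that appears in documented example URLs, while the reply recommends `ConnCatalog`/`ConnSchema` for actually selecting the catalog and schema. A sketch in Python of building such a URL (host and HTTP path are placeholders):

```python
def databricks_jdbc_url(host, http_path, catalog, schema):
    """Sketch of a Databricks JDBC URL. Per the reply above, catalog
    and schema are set via ConnCatalog/ConnSchema -- 'Database=' is
    not a recognized parameter. The '/default' after the port is the
    legacy Hive-style schema segment seen in documented examples."""
    return (
        f"jdbc:databricks://{host}:443/default;"
        f"transportMode=http;ssl=1;"
        f"httpPath={http_path};"
        f"AuthMech=3;"
        f"ConnCatalog={catalog};ConnSchema={schema}"
    )

url = databricks_jdbc_url(
    "adb-1234567890123456.7.azuredatabricks.net",  # placeholder host
    "/sql/1.0/warehouses/abc123",                  # placeholder HTTP path
    "main",
    "default",
)
```

With `ConnCatalog`/`ConnSchema` set, unqualified table names in statements resolve against that catalog and schema, which is the behavior the question was trying to get from `Database=default`.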

Meagan
by New Contributor III
  • 2134 Views
  • 1 replies
  • 0 kudos

Resolved! Delta Sharing lists tables but says "access to resource is forbidden" when reading table contents

I am using Power BI Desktop to connect to Unity Catalog using Delta Sharing. When I connect, I enter my endpoint, enter my bearer token, browse the catalog and can see my tables. But when I try to preview or load a table, I get the error "access to r...

Latest Reply
Meagan
New Contributor III
  • 0 kudos

It turned out we had to allow my IP address on the storage account used by Unity Catalog. I wasn't expecting to need that for Delta Sharing, but that indeed fixed the problem.

DennisD
by New Contributor
  • 771 Views
  • 0 replies
  • 0 kudos

Notebook stuck on Initializing RocksDB

Hi, While running a notebook during a nightly run on Azure Databricks, it got stuck on Initializing RocksDB. We are not using any streaming data, nor have we enabled RocksDB. Does anyone have any clue how to disable RocksDB or prevent this in the future? Thanks!

dprutean
by New Contributor III
  • 978 Views
  • 1 replies
  • 1 kudos

JDBC DatabaseMetaData.getColumns().getComments() encoding issue

I am using the JDBC driver to load comments saved in Databricks, associated with tables and columns. Comments saved in Chinese are returned with bad encoding. I use DatabaseMetaData.getColumns().getComments().

Latest Reply
saipujari_spark
Databricks Employee
  • 1 kudos

@dprutean Can you share an example with the expected results vs the actual results?

eimis_pacheco
by Contributor
  • 3092 Views
  • 3 replies
  • 5 kudos

Resolved! Why do companies use Databricks SQL and Redshift at the same time?

Hi community, I suddenly found myself confused, and this might sound like an obvious answer for some, but not for me, at least at this moment. I am not getting why companies use Databricks SQL and Redshift at the same time. I mean, with the Databricks platfor...

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @eimis_pacheco We haven't heard from you since the last response from @-werners-, and I was checking back to see if their suggestions helped you. Otherwise, if you have any solution, please share it with the community, as it can be helpful to others...

2 More Replies
Testing2
by New Contributor II
  • 2800 Views
  • 6 replies
  • 0 kudos

Unable to start SQL warehouse on AWS

Unable to start SQL warehouse on AWS. The warehouse stays in the starting state for a very long time and then an error is thrown.

Latest Reply
rdkarthikeyan27
New Contributor II
  • 0 kudos

Can you please check the quota for that particular SQL warehouse instance type with your cloud provider (Azure/AWS)? It may be that the quota for that instance type is exhausted for your account. We faced the same issue and requested AWS to increase the quota of that in...

5 More Replies
dsugs
by New Contributor II
  • 9534 Views
  • 4 replies
  • 2 kudos

Resolved! I'm curious if anyone has ever written a file to S3 with a custom file name?

So I've been trying to write a file to an S3 bucket giving it a custom name; everything I try just ends up with the file being dumped into a folder with the specified name, so the output is like ".../file_name/part-001.parquet". Instead I want the file t...

Latest Reply
rdkarthikeyan27
New Contributor II
  • 2 kudos

This is a Spark feature: to avoid network IO, it writes each shuffle partition as a 'part-...' file on disk, and each file, as you said, will have compression and efficient encoding by default. So yes, it is directly related to parallel processing!
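Since Spark always names its output files `part-*` inside the target directory, the common workaround for the original question is to coalesce to one partition, write to a staging directory, and then move the single part file to the exact name you want. A sketch over a local path (on Databricks with S3 you would use `dbutils.fs.mv` or boto3 instead of `shutil`; the paths here are illustrative):

```python
import glob
import os
import shutil
import tempfile

def collect_single_file(output_dir, final_path):
    """After writing a DataFrame with .coalesce(1) to output_dir,
    find the single part-* file Spark produced and move it to the
    exact path requested. Raises if there is not exactly one part
    file (i.e. coalesce(1) was not used)."""
    parts = glob.glob(os.path.join(output_dir, "part-*"))
    if len(parts) != 1:
        raise RuntimeError("expected exactly one part file; use coalesce(1)")
    shutil.move(parts[0], final_path)
    return final_path

# Simulate the directory layout Spark leaves behind:
workdir = tempfile.mkdtemp()
out = os.path.join(workdir, "file_name")
os.makedirs(out)
with open(os.path.join(out, "part-00000.parquet"), "wb") as f:
    f.write(b"...")  # stand-in for real parquet bytes

final = collect_single_file(out, os.path.join(workdir, "file_name.parquet"))
```

Note the trade-off the reply points at: coalescing to one partition gives up write parallelism, so this is only sensible for outputs small enough that a single file is what you actually want.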

3 More Replies
scvbelle
by New Contributor III
  • 4818 Views
  • 5 replies
  • 5 kudos

Resolved! Recommended ETL workflow for weekly ingestion of .sql.tz "database dumps" from Blob Storage into Unity Catalogue-enabled Metastore

The client receives data from a third party as weekly "datadumps" of a MySQL database copied into an Azure Blob Storage account container (I suspect this is done manually, I also suspect the changes between the approx 7GB files are very small). I nee...

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @Sylvia VB​ Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers you...

4 More Replies
