Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.
Data + AI Summit 2024 - Data Warehousing, Analytics, and BI

Forum Posts

MadelynM
by Databricks Employee
  • 909 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
paulocorrea
by New Contributor II
  • 907 Views
  • 3 replies
  • 0 kudos

Issue with Lateral Column Alias (LCA)

I have a query using LCA. When referencing another table that has a column with the same name as the column used as LCA, the behavior of the query changes and it starts referencing the table column instead of the column that is already in the select ...

Latest Reply
ClausStier
New Contributor II
  • 0 kudos

Hi @Kaniz_Fatma, we had the same problem as @paulocorrea. That's why it would be correct, in my view, to throw an error on ambiguous columns; the LCA could/must then be addressed with a default identifier. Thanks

2 More Replies
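For anyone reproducing the behavior described above, here is a minimal SQL sketch with hypothetical tables (orders, rates, and their columns are illustrative, not taken from the original post): the unqualified reference to the lateral column alias silently resolves to the table column of the same name.

```sql
-- Hypothetical tables: "rates" exposes a real "amount" column, which is
-- exactly the collision described in the post.
CREATE OR REPLACE TEMP VIEW orders AS SELECT 1 AS id, 100.0 AS gross;
CREATE OR REPLACE TEMP VIEW rates  AS SELECT 1 AS id, 0.2 AS tax, 999.0 AS amount;

SELECT
  o.gross * 1.1  AS amount,   -- lateral column alias "amount"
  amount * r.tax AS tax_due   -- intended to reuse the alias, but because
                              -- "rates" has a real "amount" column, the
                              -- unqualified reference resolves to r.amount
FROM orders AS o
JOIN rates AS r
  ON o.id = r.id;
```

Renaming the alias so it no longer collides with a table column, or repeating the full expression, sidesteps the ambiguity until an explicit error or qualifier syntax is available, as suggested in the reply above.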
RickB
by New Contributor II
  • 534 Views
  • 3 replies
  • 1 kudos

SQL Positional parameters: INVALID_PARAMETER_MARKER_VALUE.DUPLICATE_NAME

When trying to execute a query via a SQL warehouse, I get the following error: INVALID_PARAMETER_MARKER_VALUE.DUPLICATE_NAME. The SQL statement uses ? placeholders and the correct number of arguments is being passed. I am not able to use named placeholder...

Latest Reply
szymon_dybczak
Contributor III
  • 1 kudos

Hi @RickB, which API are you using to invoke this? Parameter markers can be provided by:
  • Python, using its pyspark.sql.SparkSession.sql() API.
  • Scala, using its org.apache.spark.sql.SparkSession.sql() API.
  • Java, using its org.apache.spark.sql.SparkSession.s...

2 More Replies
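For context, here is a minimal sketch of positional and named parameter markers expressed entirely in SQL via EXECUTE IMMEDIATE on a recent runtime (the sales table and its columns are hypothetical); the DUPLICATE_NAME part of the error typically points at the same parameter name being supplied more than once.

```sql
-- Positional markers: arguments bind strictly by position.
EXECUTE IMMEDIATE
  'SELECT * FROM sales WHERE region = ? AND order_date >= ?'
  USING 'EMEA', DATE'2024-01-01';

-- Named markers: each argument binds via its alias, and a name must not repeat.
EXECUTE IMMEDIATE
  'SELECT * FROM sales WHERE region = :region AND order_date >= :start_date'
  USING 'EMEA' AS region, DATE'2024-01-01' AS start_date;
```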
Akshay_Petkar
by Contributor
  • 647 Views
  • 2 replies
  • 2 kudos

SQL Differences When Using SSMS with Databricks Lakehouse Federation

I'm planning to connect SQL Server Management Studio (SSMS) with Databricks using Lakehouse Federation. I understand that there are some differences in the SQL dialects between SSMS and Databricks SQL. For instance, in SSMS, we use TOP 10 to limit th...

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

To add on this: if you really have to use T-SQL (the MS dialect of SQL), you can define the SQL warehouse from Databricks as a linked server on your SQL Server. As said: SSMS is merely a SQL client; the SQL dialect to be used is defined by the database...

1 More Reply
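To make the dialect point concrete, a small sketch (the sales table is hypothetical): the client does not decide the dialect, the engine does, so against Databricks SQL the T-SQL TOP clause becomes LIMIT.

```sql
-- T-SQL (SQL Server):  SELECT TOP 10 * FROM sales ORDER BY order_date DESC;
-- Databricks SQL equivalent:
SELECT *
FROM sales
ORDER BY order_date DESC
LIMIT 10;
```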
AZHAR-QUADRI
by New Contributor
  • 1003 Views
  • 1 reply
  • 0 kudos

How to create my First Dashboard in Lakeview

Hello Community. I am a newbie here with experience in Tableau and Power BI. I wanted to explore dashboard creation in Lakeview. I have created a free trial Databricks account. Although there are plenty of articles and videos on how to crea...

Latest Reply
szymon_dybczak
Contributor III
  • 0 kudos

Hi @AZHAR-QUADRI, you probably created the workspace in the standard tier; that's why you can't see the sidebar. Recreate your workspace in the premium tier.

Hubert-Dudek
by Esteemed Contributor III
  • 12000 Views
  • 2 replies
  • 0 kudos

Optimizing SQL Databricks Warehouse Timeout Settings

Did you know the default timeout setting for SQL #databricks Warehouse is two days? The default timeout can be too long for most use cases. You can easily change this for your session or in the general SQL warehouse configuration.

Latest Reply
rj_22588
New Contributor II
  • 0 kudos

Hi @Hubert-Dudek, we have a Java service which uses JdbcTemplate for connecting to a Databricks warehouse. We recently set a high connection idle timeout and maxLifetime for our Hikari pool connections. We are now seeing errors related to invalid session...

1 More Reply
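A minimal sketch of the session-level change mentioned in the post, assuming the STATEMENT_TIMEOUT configuration parameter (value in seconds); the warehouse- or workspace-wide default can also be lowered in the SQL admin settings.

```sql
-- Show the current value; the default of 172800 seconds corresponds to the
-- two days mentioned above.
SET STATEMENT_TIMEOUT;

-- Lower it for the current session only, e.g. to one hour.
SET STATEMENT_TIMEOUT = 3600;
```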
Martin_Pham
by New Contributor III
  • 1074 Views
  • 3 replies
  • 1 kudos

Resolved! SQL Warehouse Case Insensitive

Does SQL Warehouse now have a feature that allows me to query all data without distinguishing between uppercase and lowercase?

Latest Reply
szymon_dybczak
Contributor III
  • 1 kudos

Hi @Martin_Pham, at the current moment SQL Warehouses don't have this option, but according to one of the employees at Databricks, this feature is coming soon: https://community.databricks.com/t5/warehousing-analytics/sqlwarehouse-case-insensitive/td...

2 More Replies
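Until that feature lands, a common workaround is to normalize both sides of the comparison or use case-insensitive pattern matching; here is a sketch with a hypothetical customers table.

```sql
-- Equality, normalized on both sides.
SELECT *
FROM customers
WHERE LOWER(customer_name) = LOWER('Acme Corp');

-- Pattern matching: ILIKE is the case-insensitive variant of LIKE.
SELECT *
FROM customers
WHERE customer_name ILIKE 'acme%';
```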
pavel_cerny
by New Contributor II
  • 834 Views
  • 0 replies
  • 1 kudos

Databricks JDBC driver fails with socket read timeout

The application connects to a Databricks serverless SQL warehouse via the Databricks JDBC driver. It executes SQL select statements only. We see a small number of statements fail each day with the following error detail: java.sql.SQLException: [Databricks][JDB...

jon1
by New Contributor II
  • 1272 Views
  • 2 replies
  • 0 kudos

How to ensure all resources are released when closing a Connection via DataBricks JDBC Driver?

Hi there, I'm building a Sink that uses the Databricks JDBC Driver version 2.6.27. It appears that, even with a try-with-resources block and explicitly closing the Connection, the driver's internals are not releasing IdleConnectionEvictor thr...

Latest Reply
RakeshSen01
New Contributor II
  • 0 kudos

Still not fixed: we got ~14k instances of com.databricks.client.jdbc42.internal.apache.http.impl.client.IdleConnectionEvictor (14,037 instances, 673,776 B), and because of that the application crashed. Can you please help out, as this is blocking to release...

1 More Reply
swathiG
by New Contributor III
  • 874 Views
  • 3 replies
  • 1 kudos

Load backup file

Hi Team, there is a requirement to load a backup file into a database inside a SQL Warehouse. However, I don't see any option to directly load a backup file. I have tried reading the backup file using a notebook, but I'm unable to interpret the contents...

Latest Reply
swathiG
New Contributor III
  • 1 kudos

@szymon_dybczak  could you please suggest some other options for how I can proceed to load data from the database?

2 More Replies
AndreasFuchs
by New Contributor
  • 319 Views
  • 0 replies
  • 0 kudos

Cost tracing by accessed Unity Catalog catalog/schema when sharing a SQL warehouse

Hi community, we have several Databricks workspaces for different teams, and in each workspace several use cases are covered. How can we trace the costs per individual use case for SQL Warehouse / serverless when we use one SQL warehouse per workspace to ...

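As a starting point, here is a sketch against the system.billing.usage system table (assuming billing system tables are enabled and accessible in your metastore). It attributes DBUs per workspace and warehouse; splitting one shared warehouse further per use case would additionally require query-level history.

```sql
-- DBU usage over the last 30 days, grouped by workspace and SQL warehouse.
SELECT
  workspace_id,
  usage_metadata.warehouse_id AS warehouse_id,
  SUM(usage_quantity)         AS dbus
FROM system.billing.usage
WHERE usage_date >= date_sub(current_date(), 30)
  AND usage_metadata.warehouse_id IS NOT NULL
GROUP BY workspace_id, usage_metadata.warehouse_id
ORDER BY dbus DESC;
```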
Anonymous3
by New Contributor
  • 492 Views
  • 1 reply
  • 0 kudos

How to migrate Legacy Dashboard from one workspace to another workspace?

Is it possible to migrate Databricks Legacy Dashboards along with associated components such as queries and datasets from one workspace to another? I have attempted to export the Legacy Dashboards along with the queries and datasets, but upon importi...

Latest Reply
holly
Databricks Employee
  • 0 kudos

Hello, to migrate between workspaces you'll need a different strategy based on the asset type.
  • Legacy dashboards: you have completed this, but for anyone reading, you can use the 'export' functionality.
  • Queries: you can either put them together in a folder and...

