Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by New Contributor III
  • 131 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
hank12345
by New Contributor
  • 209 Views
  • 1 reply
  • 0 kudos

Can I enable Okta authentication while using Databricks ODBC driver?

We're currently using a private cloud implementation of Databricks on AWS (not the SaaS version), and the Databricks/Simba ODBC driver v2.6.29 with a personal access token to fetch data into desktop tools. Can I enable Okta web/mobile authentication with t...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @hank12345, first, check whether you're using the latest version of the Databricks ODBC driver. ODBC driver version 2.7.5 and above supports OAuth user-to-machine (U2M) authentication for Databricks users. If you're using an older version (such as ...
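
For illustration, here is a minimal sketch of what a browser-based OAuth (U2M) connection might look like from Python with pyodbc, assuming driver 2.7.5+ is installed. The driver name, host, and HTTP path are placeholders; `AuthMech=11` with `Auth_Flow=2` is the OAuth U2M setting described in the ODBC driver documentation.

```python
# Hypothetical sketch: connect over ODBC with browser-based OAuth (U2M)
# instead of a personal access token. Requires Databricks ODBC driver 2.7.5+.
import pyodbc

conn = pyodbc.connect(
    "Driver=Simba Spark ODBC Driver;"            # driver name as registered locally
    "Host=your-workspace.cloud.databricks.com;"  # placeholder workspace host
    "Port=443;"
    "HTTPPath=/sql/1.0/warehouses/abc123;"       # placeholder SQL warehouse path
    "SSL=1;"
    "AuthMech=11;"   # 11 = OAuth 2.0
    "Auth_Flow=2;",  # 2 = browser-based user-to-machine (U2M) flow
    autocommit=True,
)
with conn.cursor() as cursor:
    cursor.execute("SELECT 1")
    print(cursor.fetchone())
```

Whether Okta specifically is involved depends on the identity provider configured for the workspace's SSO; the driver only launches the OAuth flow in the browser.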

acurus
by New Contributor II
  • 1804 Views
  • 3 replies
  • 2 kudos

Not able to escape `-` in external connected tablename

We are having some issues getting data from some tables with the character `-` in their table name. We are connected to the database with a SQL Server connection, and the database is (as far as we know) a Microsoft Azure SQL Database. We do not ...

Latest Reply
martinschou
New Contributor II
  • 2 kudos

Had the same issue when querying a table with the - character in the table name. Got the error: Incorrect syntax near '-'. Got the error on Databricks Runtime version 13.2 (includes Apache Spark 3.4.0, Scala 2.12). No error when using Databricks runtim...
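
For reference, a minimal sketch of the backtick quoting that Spark SQL expects for hyphenated identifiers (catalog, schema, and table names below are made up; as the reply notes, behavior also differed between runtime versions for federated SQL Server tables):

```python
# Hypothetical sketch: backtick-quote an identifier containing "-" so the
# hyphen is not parsed as a minus operator in Spark SQL.
# Assumes a Databricks notebook where `spark` is predefined.
df = spark.sql(
    "SELECT * FROM sqlserver_catalog.dbo.`my-table-name` LIMIT 10"
)
df.show()
```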

2 More Replies
ssequ
by New Contributor II
  • 1143 Views
  • 3 replies
  • 2 kudos

Resolved! Backspaces in Foreign Catalog Table Names

Hi, is there a way to import tables with UC-disallowed characters when creating a foreign catalog? The database we are dealing with contains table names with backspaces, and UC seems to completely ignore them when creating the catalog. Thanks for any he...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 2 kudos

@ssequ I assume that you're referring to Lakehouse Federation? Unfortunately that's a limitation: tables with disallowed characters will be ignored.

2 More Replies
vishva-fivetran
by New Contributor II
  • 631 Views
  • 2 replies
  • 0 kudos

Databricks query on a warehouse failing with `spark.driver.maxResultSize` error

We are trying to run a SELECT * from one of our catalog tables. We are seeing the error: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.SparkException: Job aborted due to stage failure: Total size of serialized r...

Latest Reply
anardinelli
New Contributor III
  • 0 kudos

Hello @vishva-fivetran, how are you? To set the spark.driver.maxResultSize property, you can do so in the cluster Spark config. The property spark.driver.maxResultSize can be set to a value higher than the value reported in the exception message. For ...
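
For illustration (the 8g value is just an example), the property has to be in place when the driver starts, so on a Databricks cluster it goes under the cluster's Spark config (Advanced options) as `spark.driver.maxResultSize 8g`. In plain PySpark the equivalent looks like:

```python
# Hypothetical sketch: raise the driver's maximum collected-result size.
# On Databricks, set this in the cluster's Spark config instead; it is shown
# here on a locally built SparkSession purely for illustration.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("max-result-size-demo")
    .config("spark.driver.maxResultSize", "8g")  # example value; tune as needed
    .getOrCreate()
)
print(spark.sparkContext.getConf().get("spark.driver.maxResultSize"))
```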

1 More Replies
Akshay_Petkar
by New Contributor III
  • 480 Views
  • 1 reply
  • 1 kudos

Resolved! Is Photon Enabled by Default for Warehouses in Databricks?

When creating a cluster in Databricks, I notice an option for enabling Photon. However, this option does not appear when configuring a warehouse. Is Photon enabled by default for warehouses in Databricks, or is there a different procedure to enable i...

Latest Reply
Yeshwanth
Honored Contributor
  • 1 kudos

@Akshay_Petkar good day! Photon is enabled by default for Databricks SQL warehouses. There is no need for a separate procedure to enable it for warehouses. For clusters, you have the option to manually enable or disable Photon by selecting the "Use P...

DominikBraun
by New Contributor II
  • 610 Views
  • 2 replies
  • 1 kudos

Resolved! SQL Warehouse: INVALID_PARAMETER_VALUE when starting

Hey everybody. When creating a SQL warehouse and trying to start it, I get the following error message: Clusters are failing to launch. Cluster launch will be retried. Request to create a cluster failed with an exception: INVALID_PARAMETER_VALUE: Cannot...

Latest Reply
Ismael-K
New Contributor III
  • 1 kudos

One suggestion would be to have the workspace admin check the Data Access Configuration properties in the Workspace Settings for any secrets the warehouse may be trying to access on startup. These data access prop...

1 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 6733 Views
  • 2 replies
  • 0 kudos

Optimizing SQL Databricks Warehouse Timeout Settings

Did you know the default timeout setting for a SQL #databricks warehouse is two days? The default timeout can be too long for most use cases. You can easily change this for your session or in the general SQL warehouse configuration.
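
For a session-level change, the parameter being described is presumably STATEMENT_TIMEOUT (documented default 172800 seconds, i.e. two days). A hedged sketch using the databricks-sql-connector, with host, HTTP path, and token as placeholders:

```python
# Hypothetical sketch: lower the SQL statement timeout for a single session.
# Placeholders: server hostname, HTTP path, and access token.
from databricks import sql

with sql.connect(
    server_hostname="your-workspace.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/abc123",
    access_token="dapi-xxxx",
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SET STATEMENT_TIMEOUT = 3600")  # seconds; default 172800
        cursor.execute("SELECT current_timestamp()")
        print(cursor.fetchone())
```

Note that this statement timeout is separate from the warehouse's Auto Stop (idle shutdown) setting, which is likely what the inactivity disconnects described in the reply below are hitting.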

Latest Reply
Hertz
New Contributor II
  • 0 kudos

If the default is set to 2 days, why would my Tableau extract refresh jobs that use a SQL warehouse time out due to inactivity after 10 or 20 minutes?

1 More Replies
Akshay_Petkar
by New Contributor III
  • 803 Views
  • 1 reply
  • 0 kudos

Resolved! Databricks warehouse cost calculation

I would like to know how the cost for a warehouse in Databricks is calculated. Specifically, if the cost for a 2x small warehouse is 4 DBUs per hour, how is the cost determined if I use the warehouse for only 30 minutes and then terminate it? Will it...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 0 kudos

@Akshay_Petkar As stated in the Databricks documentation: "Databricks offers you a pay-as-you-go approach with no up-front costs. Only pay for the products you use at per second granularity." It will charge 2 DBUs.
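
As a quick sanity check of the arithmetic (the 4 DBUs/hour rate is taken from the question; billing is per second, so partial hours are prorated):

```python
# Worked example: DBU consumption for a warehouse rated at 4 DBUs/hour
# that runs for 30 minutes before being stopped.
dbu_per_hour = 4
hours_running = 0.5
print(dbu_per_hour * hours_running)  # 2.0 DBUs
```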

Akshay_Petkar
by New Contributor III
  • 301 Views
  • 1 reply
  • 0 kudos

Facing Data Truncation Issues in Databricks Dashboards

I'm encountering data truncation in my Databricks dashboards. I'm working with a large dataset, and the dashboard only displays a limited number of (truncated) rows. Let's take a dataset containing 1 million sales records. The dashboard currently only...

Latest Reply
NandiniN
Honored Contributor
  • 0 kudos

Hi @Akshay_Petkar, this is simple in Databricks SQL: just uncheck LIMIT 1000 in the drop-down. https://docs.databricks.com/en/sql/get-started/visualize-data-tutorial.html LIMIT 1000 is selected by default for all queries to ensure that the query ret...

andre_rizzatti
by New Contributor II
  • 1189 Views
  • 1 reply
  • 1 kudos

Resolved! SQLWarehouse Case INsensitive

Good morning, is there any parameter or configuration that makes all queries against my data case-insensitive?

Latest Reply
raphaelblg
Honored Contributor
  • 1 kudos

Hello @andre_rizzatti, at the moment SQL warehouses don't have this option. This feature is coming soon.

amelia1
by New Contributor II
  • 405 Views
  • 1 reply
  • 0 kudos

Local pyspark read data using jdbc driver returns column names only

Hello, I have an Azure serverless SQL warehouse instance that I can connect to using databricks-sql-connector. But when I try to use PySpark and a JDBC driver URL, I can't read or write. See my code below: def get_jdbc_url(): # Define your Databricks p...
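
For context, a heavily hedged sketch of a local PySpark read from a SQL warehouse over JDBC (host, HTTP path, token, jar path, and table are placeholders; the URL format and driver class follow the Databricks JDBC driver documentation):

```python
# Hypothetical sketch: read from a Databricks SQL warehouse into a local
# PySpark session over JDBC. All identifiers below are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("jdbc-read-demo")
    .config("spark.jars", "/path/to/DatabricksJDBC42.jar")  # Databricks JDBC driver jar
    .getOrCreate()
)

jdbc_url = (
    "jdbc:databricks://your-workspace.cloud.databricks.com:443;"
    "httpPath=/sql/1.0/warehouses/abc123;"
    "AuthMech=3;UID=token;PWD=dapi-xxxx"  # personal access token auth
)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("driver", "com.databricks.client.jdbc.Driver")
    .option("query", "SELECT * FROM main.default.my_table LIMIT 10")
    .load()
)
df.show()
```

If every row comes back as the literal column names, the commonly reported culprit is the identifier quoting that Spark's generic JDBC dialect generates for the warehouse; in that case the databricks-sql-connector path that already works may be the simpler route.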

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @amelia1, The error messages you provided indicate that there might be a problem with the log4j configuration and formatting. Additionally, the repeated column names suggest that there might be an issue with how the data is being retrieved. Her...

mbhakta
by New Contributor II
  • 2782 Views
  • 3 replies
  • 2 kudos

Change Databricks Connection on Power BI (service)

We're creating a report with Power BI using data from our AWS Databricks workspace. Currently, I can view the report on Power BI (service) after publishing. Is there a way to change the data source connection, e.g. if I want to change the data source...

Latest Reply
Srushti
New Contributor II
  • 2 kudos

Have you got any solution for this?

2 More Replies
diego_poggioli
by Contributor
  • 2069 Views
  • 5 replies
  • 2 kudos

Resolved! Nested subquery is not supported in the DELETE condition

According to the documentation, the WHERE predicate in a DELETE statement should support subqueries, including IN, NOT IN, EXISTS, NOT EXISTS, and scalar subqueries. If I try to run a query like:   DELETE FROM dev.gold.table AS trg WHERE EXISTS ( ...

Latest Reply
Tejas2022
New Contributor II
  • 2 kudos

@diego_poggioli Can you try selecting the 'year_month_version' column from the view instead of SELECT *? DELETE FROM dev.gold.table AS trg WHERE year_month_version IN (SELECT year_month_version FROM v_distinct_year_month_version)
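
A runnable form of the suggested rewrite (table, view, and column names are taken from the thread; the sketch assumes a Databricks notebook where `spark` is predefined):

```python
# Hypothetical sketch: replace the EXISTS/SELECT * predicate with a simple
# IN subquery over the single column the delete actually depends on.
spark.sql("""
    DELETE FROM dev.gold.table AS trg
    WHERE trg.year_month_version IN (
        SELECT year_month_version FROM v_distinct_year_month_version
    )
""")
```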

4 More Replies
