Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

VVM
by New Contributor III
  • 19684 Views
  • 14 replies
  • 3 kudos

Resolved! Databricks SQL - Unable to Escape Dollar Sign ($) in Column Name

It seems that due to how Databricks processes SQL cells, it's impossible to escape the $ when it comes to a column name. I would expect the following to work: %sql SELECT 'hi' `$id` The backticks ought to escape everything. And indeed that's exactly wha...

Latest Reply
rgower
New Contributor II
  • 3 kudos

+1 here - hoping to hear any updates.

13 More Replies
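One workaround sometimes used for the issue in the thread above, offered as a sketch rather than an official fix: the $-substitution applies to %sql cells, so issuing the same statement through spark.sql() from a Python cell sidesteps it. The column name `$id` comes from the question; the built-in `spark` session of a Databricks notebook is assumed.

# Hedged sketch of a possible workaround: run the statement via spark.sql()
# from a Python cell, where the notebook's $-substitution for %sql cells
# does not apply. Assumes the built-in `spark` session of a Databricks notebook.
df = spark.sql("SELECT 'hi' AS `$id`")
df.show()

# The column can still be referenced afterwards by its exact name:
df.select(df["$id"]).show()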
yopbibo
by Contributor II
  • 22482 Views
  • 5 replies
  • 4 kudos

How can I connect to an Azure SQL db from a Databricks notebook?

I know how to do it with Spark, and read/write tables (like https://docs.microsoft.com/en-gb/azure/databricks/data/data-sources/sql-databases#python-example). But this time, I only need to update a field of a specific row in a table. I do not think I ...

Latest Reply
raopheefah
New Contributor II
  • 4 kudos

Look at your compute configuration. It looks like this works perfectly on Dedicated (formerly: single user) or No Isolation clusters, but not on Standard (formerly: Shared) ones. Maybe you need a disposable one-time job cluster with these settings.

4 More Replies
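Since the question above is about updating a single row rather than writing a whole DataFrame, one common approach is to talk to Azure SQL directly with pyodbc from the notebook. A minimal sketch, assuming pyodbc and the Microsoft ODBC driver are installed on the cluster (which, per the reply above, may also depend on the cluster access mode); the server, database, table, column, and secret names are placeholders.

# Hedged sketch: update one row in Azure SQL from a Databricks notebook with pyodbc.
# Assumes pyodbc and the Microsoft ODBC driver are available on the cluster
# (e.g. %pip install pyodbc plus the msodbcsql driver via an init script).
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=mydb;"
    "UID=myuser;"
    "PWD=" + dbutils.secrets.get("my-scope", "sql-password")
)
cursor = conn.cursor()
cursor.execute(
    "UPDATE dbo.my_table SET status = ? WHERE id = ?",
    ("processed", 42),
)
conn.commit()
conn.close()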
prasadvaze
by Valued Contributor II
  • 8102 Views
  • 4 replies
  • 6 kudos

Resolved! Limit on number of result rows displayed on databricks SQL UI

The Databricks SQL UI currently limits the query results display to 64,000 rows. When will this limit go away? Using SSMS I get 40MM-row results in the UI, and my users won't switch to Databricks SQL for this reason.

Latest Reply
User16765136105
New Contributor III
  • 6 kudos

Hi @prasad vaze - We do have a feature in the works that will increase this limit. If you reach out to your Databricks contact, they can give you more details regarding dates and the preview.

3 More Replies
DataGirl
by New Contributor
  • 14666 Views
  • 6 replies
  • 2 kudos

Multi value parameter on Power BI Paginated / SSRS connected to databricks using ODBC

Hi all, I'm wondering if anyone has had any luck setting up multi-valued parameters in SSRS using an ODBC connection to Databricks? I'm getting a "Cannot add multi value query parameter" error every time I change my parameter to multi-value. In the query s...

Latest Reply
ssrsnat
New Contributor II
  • 2 kudos

Hi, I am working on having SSRS reports access Databricks and am facing similar challenges. I see you tried this back in 2022. Can you please advise on the approach to handle the multi-value parameters? Thanks, Sam

5 More Replies
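A workaround often suggested for the multi-value limitation above, sketched here rather than taken from the replies: have SSRS join the selected values into a single comma-separated string (for example with the expression =JOIN(Parameters!Region.Value, ",")) and split it back apart inside the query. The table, column, and parameter names below are hypothetical, and the parameterized spark.sql call, shown purely to illustrate the SQL side, assumes a recent Databricks Runtime.

# Sketch only: the multi-select arrives as one comma-separated string and is
# split back into values inside the query. Names below are hypothetical.
selected_regions = "EMEA,APAC,AMER"  # what the report would pass as a single string

df = spark.sql(
    """
    SELECT *
    FROM sales.orders
    WHERE array_contains(split(:regions, ','), region)
    """,
    args={"regions": selected_regions},
)
df.show()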
sanq
by New Contributor II
  • 5707 Views
  • 3 replies
  • 7 kudos

what formatter is used to format SQL cell in databricks

Databricks launched the Black formatter, which formats Python cells. I can also see SQL cells getting formatted, but I'm not sure which formatter is being used for SQL cell formatting. No clarity is given in the docs.

Latest Reply
mitch_DE
New Contributor II
  • 7 kudos

The formatter is mentioned here: Develop code in Databricks notebooks - Azure Databricks | Microsoft Learn. It is this npm package: @gethue/sql-formatter - npm

2 More Replies
Ajay-Pandey
by Esteemed Contributor III
  • 6491 Views
  • 5 replies
  • 5 kudos

Support of running multiple cells at a time in databricks notebook

Hi all, Databricks notebooks now support parallel runs of commands in a single notebook, which helps run ad hoc queries simultaneously without creating a separate notebook. Once you run...

Latest Reply
SunilUIIT
New Contributor II
  • 5 kudos

Hi Team, I am observing that the functionality is not working as expected in the Trial workspace of Databricks. Is there a setting that needs to be enabled to allow independent SQL cells in a Databricks notebook to run in parallel, while dependent cel...

4 More Replies
najmead
by Contributor
  • 25489 Views
  • 7 replies
  • 13 kudos

How to convert string to datetime with correct timezone?

I have a field stored as a string in the format "12/30/2022 10:30:00 AM". If I use the function TO_DATE, I only get the date part... I want the full date and time. If I use the function TO_TIMESTAMP, I get the date and time, but it's assumed to be UTC, ...

Latest Reply
Rajeev_Basu
Contributor III
  • 13 kudos

Use from_utc_timestamp(to_timestamp(<string>, <format>), <timezone>)

6 More Replies
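The reply above in PySpark form, as a minimal sketch: the datetime pattern matches the example string from the question, and the target timezone is an assumption to replace with your own.

# Sketch of the reply's approach. "MM/dd/yyyy hh:mm:ss a" matches the sample
# string "12/30/2022 10:30:00 AM"; the timezone is assumed for illustration.
from pyspark.sql import functions as F

df = spark.createDataFrame([("12/30/2022 10:30:00 AM",)], ["raw"])
df = df.withColumn(
    "ts_local",
    F.from_utc_timestamp(
        F.to_timestamp("raw", "MM/dd/yyyy hh:mm:ss a"),
        "Australia/Sydney",  # assumed target timezone
    ),
)
df.show(truncate=False)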
LightUp
by New Contributor III
  • 8530 Views
  • 3 replies
  • 4 kudos

Converting SQL Code to SQL Databricks

I am new to Databricks. Please excuse my ignorance. My requirement is to convert the SQL query below into Databricks SQL. The query comes from the EventLog table and its output goes into EventSummary. These queries can be found here: CREATE TABL...

Latest Reply
thelogicplus
Contributor
  • 4 kudos

You may explore the tools and services from Travinto Technologies. They have very good tools. We explored their tool for our code conversion from Informatica, DataStage, and Ab Initio to Databricks/PySpark. We also used it for SQL queries, stored ...

2 More Replies
dimsh
by Contributor
  • 16755 Views
  • 13 replies
  • 10 kudos

How to overcome missing query parameters in Databricks SQL?

Hi there! I'm trying to build my first dashboard based on Databricks SQL. As far as I can see, if you define a query parameter you can't skip it later. I'm looking for any option that lets me make my parameter optional. For instance, I have a ta...

Latest Reply
techg
New Contributor II
  • 10 kudos

Is there any solution for the above-mentioned post?

12 More Replies
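One workaround commonly suggested for the thread above, sketched here rather than taken from the replies: give the parameter a sentinel default such as 'ALL' and write the filter so that value disables it. In a Databricks SQL query the same shape would be WHERE ({{ category }} = 'ALL' OR category = {{ category }}). The table and parameter names below are hypothetical, and the parameterized spark.sql call assumes a recent Databricks Runtime.

# Sentinel-default pattern for an "optional" filter, illustrated via spark.sql.
category = "ALL"  # the value the dashboard would send when the filter is skipped

df = spark.sql(
    """
    SELECT *
    FROM demo.events
    WHERE (:category = 'ALL' OR category = :category)
    """,
    args={"category": category},
)
df.show()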
mickniz
by Contributor
  • 30404 Views
  • 8 replies
  • 19 kudos

cannot import name 'sql' from 'databricks'

I am working on a Databricks 10.4 premium cluster, and while importing sql from the databricks module I am getting the error below: cannot import name 'sql' from 'databricks' (/databricks/python/lib/python3.8/site-packages/databricks/__init__.py). Trying...

Latest Reply
ameet9257
Contributor
  • 19 kudos

If you ever receive this kind of error after installing the correct Python package, try running the command below: dbutils.library.restartPython()

7 More Replies
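For the import error above, the usual cause is that the databricks-sql-connector package is not installed, so the databricks namespace has no sql module; the restartPython() call from the reply picks up the freshly installed package. A minimal sketch; the hostname, HTTP path, and secret names are placeholders.

# In a notebook cell, install the connector and restart Python first:
#   %pip install databricks-sql-connector
#   dbutils.library.restartPython()
from databricks import sql

# Connection details are placeholders; a personal access token is assumed.
conn = sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abc123",
    access_token=dbutils.secrets.get("my-scope", "databricks-token"),
)
with conn.cursor() as cursor:
    cursor.execute("SELECT 1")
    print(cursor.fetchall())
conn.close()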
noimeta
by Contributor III
  • 5848 Views
  • 7 replies
  • 4 kudos

Resolved! Databricks SQL: catalog of each query

Currently, we are migrating from the Hive metastore to UC. We have several dashboards and a huge number of queries whose catalogs have been set to hive_metastore and that use the <db>.<table> access pattern. I'm just wondering if there's a way to switch catalogs...

Latest Reply
h_h_ak
Contributor
  • 4 kudos

Maybe you can also have a look here if you need a hotfix: https://github.com/databrickslabs/ucx

6 More Replies
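Not from the thread's replies, just the relevant command: two-part <db>.<table> references resolve against the session's current catalog, so existing queries can be pointed at a UC catalog by switching that catalog instead of rewriting every table reference. A hedged sketch; the catalog, schema, and table names are placeholders.

# Switch the catalog that two-part names resolve against for this session.
spark.sql("USE CATALOG my_uc_catalog")
spark.sql("USE SCHEMA my_db")  # optional: also pin the default schema

# This now resolves to my_uc_catalog.my_db.my_table instead of hive_metastore.
df = spark.sql("SELECT * FROM my_db.my_table")
df.show()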
Data_Engineer3
by Contributor III
  • 3854 Views
  • 5 replies
  • 0 kudos

Default maximum spark streaming chunk size in delta files in each batch?

When working with Delta files in Spark Structured Streaming, what is the default maximum chunk size in each batch? How do I identify this type of Spark configuration in Databricks? #[Databricks SQL] #[Spark streaming] #[Spark structured streaming] #Spark

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Doc: https://docs.databricks.com/en/structured-streaming/delta-lake.html Also, what is the challenge while using foreachBatch?

4 More Replies
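Following the doc linked in the reply above: for a Delta source the per-batch size is controlled by the maxFilesPerTrigger and maxBytesPerTrigger read options (maxFilesPerTrigger defaults to 1000 files per micro-batch). A minimal sketch; the source, checkpoint, and sink paths are placeholders.

# Cap how much a Delta streaming source pulls into each micro-batch.
stream = (
    spark.readStream.format("delta")
    .option("maxFilesPerTrigger", 500)   # max files per micro-batch
    .option("maxBytesPerTrigger", "1g")  # soft cap on bytes per micro-batch
    .load("/mnt/raw/events")
)

query = (
    stream.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/chk/events")
    .start("/mnt/silver/events")
)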
merca
by Valued Contributor II
  • 10681 Views
  • 12 replies
  • 7 kudos

Value array {{QUERY_RESULT_ROWS}} in Databricks SQL alerts custom template

Please include in the documentation an example of how to incorporate the `QUERY_RESULT_ROWS` variable in the custom template.

Latest Reply
CJK053000
New Contributor III
  • 7 kudos

Databricks confirmed this was an issue on their end and it should be resolved now. It is working for me.

11 More Replies
Graham
by New Contributor III
  • 5511 Views
  • 4 replies
  • 3 kudos

Resolved! Inline comment next to un-tickmarked SET statement = Syntax error

Running this code in Databricks SQL works great: SET USE_CACHED_RESULT = FALSE; -- Result: -- key value -- USE_CACHED_RESULT FALSE. If I add an inline comment, however, I get a syntax error: SET USE_CACHED_RESUL...

Latest Reply
rafal_walisko
New Contributor II
  • 3 kudos

Hi, I'm getting the same error when trying to execute a statement through the API: "statement": "SET `USE_CACHED_RESULT` = FALSE; SELECT COUNT(*) FROM TABLE". Every combination fails: "status": { "state": "FAILED", "error": { "e...

3 More Replies
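A reconstruction of the two forms from the thread above plus an obvious sidestep, shown through spark.sql purely for illustration; whether the inline form errors depends on the SQL surface you run it on (the report is about Databricks SQL), so treat this as a sketch rather than a confirmed fix.

spark.sql("SET use_cached_result = FALSE")  # the bare statement works per the thread

# Reported above to fail when the comment sits inline on the same line:
#   SET use_cached_result = FALSE;  -- disable result caching

# Sidestep: keep the comment on its own line.
spark.sql("""
-- disable result caching
SET use_cached_result = FALSE
""")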
isaac_gritz
by Databricks Employee
  • 4372 Views
  • 5 replies
  • 5 kudos

SQL IDE Support

How to use a SQL IDE with Databricks SQL: Databricks provides SQL IDE support using DataGrip and DBeaver with Databricks SQL. Let us know in the comments if you've used DataGrip or DBeaver with Databricks! Let us know if there are any other SQL IDEs you...

Latest Reply
Jag
New Contributor III
  • 5 kudos

DBeaver is working perfectly fine, but I found one issue: it won't show the correct error for a query.

4 More Replies