Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Erik
by Valued Contributor III
  • 19756 Views
  • 19 replies
  • 15 kudos

How to enable/verify cloud fetch from PowerBI

I tried to benchmark the Power BI Databricks connector vs. the Power BI Delta Lake reader on a dataset of 2.15 million rows. I found that the Delta Lake reader took 20 seconds, while importing through the SQL compute endpoint took ~75 seconds. When I loo...

Latest Reply
datadrivenangel
New Contributor
  • 15 kudos

I'm troubleshooting slow speeds (~6 Mbps) from Azure Databricks to the Power BI service (Fabric) via dataflows. Drivers are up to date; Power BI is using Microsoft's Spark ODBC driver version 2.7.6.1014, confirmed via log4j. HybridCloudStoreResultHandler...

18 More Replies
Mado
by Valued Contributor II
  • 36851 Views
  • 4 replies
  • 3 kudos

Resolved! How to set a variable and use it in a SQL query

I want to define a variable and use it in a query, like below: %sql SET database_name = "marketing"; SHOW TABLES IN '${database_name}'; However, I get the following error: ParseException: [PARSE_SYNTAX_ERROR] Syntax error at or near '''' (line 1, pos...

Latest Reply
olufemi_anthony
Databricks Employee
  • 3 kudos

Updated usage as of DBR 15.2+: https://docs.databricks.com/en/notebooks/widgets.html#use-widget-values-in-spark-sql-and-sql-warehouse

3 More Replies
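A minimal sketch of the DBR 15.2+ pattern the reply above points to, assuming a Databricks notebook where spark.sql() accepts named arguments (Spark 3.4+); the "marketing" schema and "customers" table are hypothetical, and the value could equally come from dbutils.widgets.get("database_name"):

# Pass the value as a named parameter and wrap it in IDENTIFIER() where an object
# name is expected, instead of string-interpolating it into the SQL text.
database_name = "marketing"  # hypothetical; e.g. dbutils.widgets.get("database_name")
df = spark.sql(
    "SELECT * FROM IDENTIFIER(:tbl) LIMIT 10",
    args={"tbl": f"{database_name}.customers"},
)
df.show()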
SQL
by New Contributor II
  • 2497 Views
  • 6 replies
  • 1 kudos

Presto hive table to delta table conversion

Hi everyone, I am using the SQL query below to generate the days in order in Hive, and it works fine. The table was migrated to Delta and my query is now failing. I would appreciate it if someone could help me figure out the issue. SQL query: with ex...

Latest Reply
thelogicplus
Contributor
  • 1 kudos

Hi @SQL @jose_gonzalez, have you tried the code conversion tool from Travinto Technologies? They have Hive-to-Delta table conversion.

5 More Replies
najmead
by Contributor
  • 20915 Views
  • 7 replies
  • 13 kudos

How to convert string to datetime with correct timezone?

I have a field stored as a string in the format "12/30/2022 10:30:00 AM". If I use the function TO_DATE, I only get the date part... I want the full date and time. If I use the function TO_TIMESTAMP, I get the date and time, but it's assumed to be UTC, ...

Latest Reply
Rajeev_Basu
Contributor III
  • 13 kudos

Use from_utc_timestamp(to_timestamp("<string>", <format>), <timezone>).

6 More Replies
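A minimal sketch of the reply above, assuming PySpark; the column name and target timezone are hypothetical. to_timestamp parses the string with an explicit format, and from_utc_timestamp then shifts the parsed (UTC-assumed) value into the desired zone:

from pyspark.sql import functions as F

df = spark.createDataFrame([("12/30/2022 10:30:00 AM",)], ["event_time_str"])
df = df.withColumn(
    "event_time",
    F.from_utc_timestamp(
        F.to_timestamp("event_time_str", "MM/dd/yyyy hh:mm:ss a"),
        "Australia/Sydney",  # hypothetical target timezone
    ),
)
df.show(truncate=False)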
LightUp
by New Contributor III
  • 7274 Views
  • 3 replies
  • 4 kudos

Converting SQL Code to SQL Databricks

I am new to Databricks, please excuse my ignorance. My requirement is to convert the SQL query below into Databricks SQL. The query reads from the EventLog table and its output goes into EventSummary. These queries can be found here: CREATE TABL...

Latest Reply
thelogicplus
Contributor
  • 4 kudos

You may explore the tools and services from Travinto Technologies. They have very good tools. We explored their tool for our code conversion from Informatica, DataStage, and Ab Initio to Databricks and PySpark. We also used it for SQL queries, stored ...

2 More Replies
gillzer84
by New Contributor
  • 4565 Views
  • 4 replies
  • 5 kudos

An example of how to connect to SQL Server data using Windows authentication

We use SQL Server to store data. I would like to connect to SQL Server to pull, manipulate, and sometimes push data back. I've seen some examples of connecting online, but I cannot successfully re-create them.

Latest Reply
Junee
New Contributor III
  • 5 kudos

You can use the jTDS library from Maven; add it to your cluster. Once installed, you can write the code below to connect to your database. The code in Scala will be:
import java.util.Properties
val driverClass = "net.sourceforge.jtds.jdbc.Driver"
val serve...

3 More Replies
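A hedged sketch of the jTDS approach from the reply above, in PySpark rather than Scala; the host, database, domain, and table names are hypothetical, and the jTDS driver (net.sourceforge.jtds:jtds) must be installed on the cluster as a Maven library:

# Windows (NTLM) authentication is requested through the jTDS URL properties.
jdbc_url = (
    "jdbc:jtds:sqlserver://sqlserver-host:1433/MyDatabase;"
    "domain=MYDOMAIN;useNTLMv2=true"
)
df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("driver", "net.sourceforge.jtds.jdbc.Driver")
    .option("dbtable", "dbo.SomeTable")      # hypothetical table
    .option("user", "windows_user")          # domain account, without the DOMAIN\ prefix
    .option("password", "windows_password")
    .load()
)
df.show()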
Jyo777
by Contributor
  • 5594 Views
  • 7 replies
  • 4 kudos

Need help with Azure Databricks questions on CTE and SQL syntax within notebooks

Hi amazing community folks, feel free to share your experience or knowledge regarding the questions below: 1) Can we pass a CTE SQL statement into Spark JDBC? I tried to do it and couldn't, but I can pass normal SQL (SELECT * FROM ) and it works. I heard th...

Latest Reply
Rjdudley
Valued Contributor
  • 4 kudos

Not a comparison, but there is a DB-SQL cheatsheet at https://www.databricks.com/sites/default/files/2023-09/databricks-sql-cheatsheet.pdf/

6 More Replies
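A hedged sketch for question 1 in the post above, assuming PySpark and a hypothetical SQL Server source. Spark's JDBC reader wraps the pushed-down statement in a subquery, which is why a bare WITH ... CTE often fails while a plain SELECT works; one workaround is to inline the CTE as a derived table and pass it through the query option:

jdbc_url = "jdbc:sqlserver://sqlserver-host:1433;databaseName=MyDatabase"  # hypothetical

# The former CTE body, rewritten as a derived table so it survives Spark's subquery wrapping.
pushdown_query = """
SELECT o.order_id, o.amount
FROM (SELECT order_id, amount FROM dbo.orders WHERE amount > 100) AS o
"""

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("query", pushdown_query)
    .option("user", "sql_user")
    .option("password", "sql_password")
    .load()
)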
183530
by New Contributor III
  • 1493 Views
  • 2 replies
  • 2 kudos

How to search an array of words in a text field

Example: TABLE 1, column FIELD_TEXT with rows "I like salty food and Italian food" / "I have Italian food" / "bread, rice and beans" / "mexican foods" / "coke, sprite". Array: ['italia', 'mex', 'coke']. Match TABLE1 x ARRAY. Results: "I like salty food and Italian food", "I have Italian food", "mexican foods"; is ...

Latest Reply
Meredithharper
New Contributor II
  • 2 kudos

Yes, you can do it in SQL with LIKE or IN, and in PySpark using array_contains; both are good fits for filtering on a list of words like this.

1 More Replies
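A minimal sketch for the post above, assuming PySpark; the table and column names are hypothetical. Each keyword becomes a case-insensitive substring test, and the tests are OR-ed together:

from functools import reduce
from pyspark.sql import functions as F

keywords = ["italia", "mex", "coke"]
df = spark.createDataFrame(
    [("I like salty food and Italian food",), ("bread, rice and beans",), ("mexican foods",)],
    ["FIELD_TEXT"],
)

# Build one OR condition out of a contains() test per keyword.
condition = reduce(
    lambda a, b: a | b,
    [F.lower(F.col("FIELD_TEXT")).contains(kw.lower()) for kw in keywords],
)
df.filter(condition).show(truncate=False)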
Sen
by New Contributor
  • 9297 Views
  • 9 replies
  • 1 kudos

Resolved! Performance enhancement while writing dataframes into Parquet tables

Hi, I am trying to write the contents of a dataframe into a Parquet table using the command below: df.write.mode("overwrite").format("parquet").saveAsTable("sample_parquet_table"). The dataframe contains an extract from one of our source systems, which h...

Latest Reply
jhoon
New Contributor II
  • 1 kudos

Great discussion on performance optimization! Managing technical projects like these alongside academic work can be demanding. If you need expert academic support to free up time for your professional pursuits, Dissertation Help Services is here to a...

8 More Replies
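A hedged sketch related to the post above (not the thread's accepted answer): one common lever when saveAsTable on Parquet is slow is controlling the number of output files by repartitioning before the write. The source table name and partition count are hypothetical:

df = spark.table("source_extract")           # hypothetical source dataframe

(
    df.repartition(64)                       # target a sensible number of output files
      .write.mode("overwrite")
      .format("parquet")
      .saveAsTable("sample_parquet_table")
)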
dimsh
by Contributor
  • 14797 Views
  • 13 replies
  • 10 kudos

How to overcome missing query parameters in Databricks SQL?

Hi there! I'm trying to build my first dashboard based on Databricks SQL. As far as I can see, if you define a query parameter you can't skip it later. I'm looking for any option to make my parameter optional. For instance, I have a ta...

Latest Reply
techg
New Contributor II
  • 10 kudos

Is there any solution for the above-mentioned post?

12 More Replies
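A hedged sketch of one common workaround for the post above (the thread shows no accepted answer): give the parameter a sentinel default such as "All" and let the query treat that value as "no filter". The table, column, and parameter names are hypothetical, shown here with a parameterized spark.sql call:

country = "All"  # value that would normally come from the dashboard parameter

df = spark.sql(
    """
    SELECT *
    FROM sales.orders
    WHERE (:country = 'All' OR country = :country)
    """,
    args={"country": country},
)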
noimeta
by Contributor III
  • 5095 Views
  • 9 replies
  • 4 kudos

Resolved! Databricks SQL: catalog of each query

Currently, we are migrating from the Hive metastore to UC. We have several dashboards and a huge number of queries whose catalog has been set to hive_metastore and which use the <db>.<table> access pattern. I'm just wondering if there's a way to switch catalogs...

Latest Reply
h_h_ak
Contributor
  • 4 kudos

Maybe you can also have a look here if you need a hot fix: https://github.com/databrickslabs/ucx

8 More Replies
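A hedged sketch related to the post above: once the objects exist in Unity Catalog, setting the session's current catalog lets existing two-part <db>.<table> references resolve against it instead of hive_metastore. The catalog, schema, and table names are hypothetical:

spark.sql("USE CATALOG main")  # make `main` the current catalog for this session

# A legacy two-part reference now resolves to main.marketing.customers.
df = spark.sql("SELECT * FROM marketing.customers LIMIT 10")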
swzzzsw
by New Contributor III
  • 4942 Views
  • 6 replies
  • 0 kudos

Resolved! SQLServerException: deadlock

I'm using Databricks to connect to a SQL managed instance via JDBC. The SQL operations I need to perform include DELETE, UPDATE, and simple reads and writes. Since Spark syntax only handles simple reads and writes, I had to open a SQL connection using Scala an...

Latest Reply
Panda
Valued Contributor
  • 0 kudos

@swzzzsw Since you are performing database operations, to reduce the chance of deadlocks, make sure to wrap your SQL operations inside transactions using commit and rollback. Another approach to consider is adding retry logic or using Isolation Leve...

5 More Replies
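A hedged sketch of the transaction-plus-retry pattern from the reply above, using pyodbc from Python rather than the thread's Scala JDBC code; the connection string, table, and retry settings are hypothetical (SQL Server reports deadlock victims as error 1205):

import time
import pyodbc

conn = pyodbc.connect("DSN=sql_managed_instance")  # hypothetical DSN / connection string

def run_with_retry(sql, params=(), max_attempts=3):
    for attempt in range(1, max_attempts + 1):
        try:
            cursor = conn.cursor()
            cursor.execute(sql, params)
            conn.commit()             # commit the transaction on success
            return
        except pyodbc.Error:
            conn.rollback()           # roll back so the next attempt starts clean
            if attempt == max_attempts:
                raise
            time.sleep(2 ** attempt)  # back off before retrying the deadlocked statement

run_with_retry("UPDATE dbo.SomeTable SET status = ? WHERE id = ?", ("done", 42))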
elgeo
by Valued Contributor II
  • 29003 Views
  • 5 replies
  • 2 kudos

SQL Stored Procedure in Databricks

Hello. Is there an equivalent of a SQL stored procedure in Databricks? Please note that I need a procedure that allows DML statements, not only the SELECT statements that a function provides. Thank you in advance.

Latest Reply
Biswajit
New Contributor II
  • 2 kudos

I recently went through a video from Databricks which says it's possible, but when I tried it, it did not work: https://www.youtube.com/watch?v=f4TxNBfSNqM Was anyone able to create a stored procedure in Databricks?

4 More Replies
Ericsson
by New Contributor II
  • 3220 Views
  • 3 replies
  • 1 kudos

SQL week format issue: it is not showing the result as 01 (ww)

Hi folks, I have a requirement to show the week number in ww format. Please see the code below: select weekofyear(date_add(to_date(current_date, 'yyyyMMdd'), +35)). Also, please refer to the screenshot for the result.

Latest Reply
Feltonrolfson
New Contributor II
  • 1 kudos

It seems you're encountering an issue with SQL week formatting, where the results aren't displaying as expected (01 for ww). This could impact data analysis.

2 More Replies
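A minimal sketch for the post above, assuming PySpark: weekofyear() returns an integer (e.g. 5), so one way to get the two-digit "ww" form is to left-pad the value to two characters. The 35-day offset mirrors the post's example:

from pyspark.sql import functions as F

df = spark.range(1).select(
    F.lpad(
        F.weekofyear(F.date_add(F.current_date(), 35)).cast("string"),
        2,
        "0",
    ).alias("week_ww")
)
df.show()  # e.g. "05" instead of 5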
AlexDavies
by Contributor
  • 7921 Views
  • 9 replies
  • 2 kudos

Report on SQL queries that are being executed

We have a SQL workspace with a running cluster that serves a number of self-service reports against a range of datasets. We want to be able to analyse and report on the queries our self-service users are executing so we can get better visibility of...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hey there @Alex Davies, hope you are doing great. Just checking in to see whether you were able to resolve your issue or whether you need more help. We'd love to hear from you. Thanks!

8 More Replies
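A hedged sketch related to the post above: on workspaces where Unity Catalog system tables are enabled, recent SQL warehouse query activity can be inspected from the system.query.history table. Availability and the exact column set depend on the workspace, so treat the table name as an assumption to verify against the Databricks documentation:

# Pull a sample of recent query history records for ad-hoc analysis.
recent = spark.sql("SELECT * FROM system.query.history LIMIT 100")
display(recent)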