Data Engineering

Forum Posts

sp1
by New Contributor II
  • 8316 Views
  • 7 replies
  • 4 kudos

Resolved! Pass date value as parameter in Databricks SQL notebook

I want to pass yesterday's date (in the example, 20230115*.csv) into the CSV file path. I don't know how to create a parameter and use it here. CREATE OR REPLACE TEMPORARY VIEW abc_delivery_log USING CSV OPTIONS ( header="true", delimiter=",", inferSchema="true", pat...

Latest Reply
Asifpanjwani
  • 4 kudos

@Kaniz @sp1 @Chaitanya_Raju @daniel_sahal Hi everyone, I need the same scenario in SQL code, because my DBR cluster does not allow me to run Python code. Error: Unsupported cell during execution. SQL warehouses only support executing SQL cells. I appreciate...

6 More Replies
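A minimal sketch of the pattern discussed in this thread, for a cluster that allows Python cells (the replies note that SQL warehouses cannot run Python): compute yesterday's date in Python and interpolate it into the view definition. The base path /mnt/landing/ is a hypothetical placeholder; the view name and CSV options come from the post.

```python
# Sketch only: assumes a Databricks notebook where `spark` is predefined
# and Python cells are allowed. The directory below is hypothetical.
from datetime import date, timedelta

yesterday = (date.today() - timedelta(days=1)).strftime("%Y%m%d")  # e.g. 20230115

spark.sql(f"""
    CREATE OR REPLACE TEMPORARY VIEW abc_delivery_log
    USING CSV
    OPTIONS (
        header "true",
        delimiter ",",
        inferSchema "true",
        path "/mnt/landing/{yesterday}*.csv"
    )
""")
```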
amitca71
by Contributor II
  • 3271 Views
  • 5 replies
  • 4 kudos

Resolved! Exception when using Java SQL client

Hi, I am trying to use the Java SQL client. I can see that the query on Databricks is executed properly. However, on my client I get an exception (see below). Versions: JDK jdk-20.0.1 (also tried version 16, same results) https://www.oracle.com/il-en/java/technologies/...

Latest Reply
xebia
New Contributor II
  • 4 kudos

I am using Java 17 and getting the same error.

4 More Replies
nyehia
by Contributor
  • 2477 Views
  • 9 replies
  • 0 kudos

Cannot access a SQL file from a notebook

Hey, I have a repo of notebooks and SQL files. The typical way is to update/create notebooks in the repo and push them, then the CI/CD pipeline deploys the notebooks to the shared workspace. The issue is that I can access the SQL files in the repo but cannot ...

Latest Reply
ok_1
New Contributor II
  • 0 kudos

ok

8 More Replies
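One commonly suggested workaround for this situation: SQL files in a Repo are not runnable like notebooks, but on runtimes where Repo files are exposed through the local filesystem they can be read with ordinary Python file APIs and executed with spark.sql. A minimal sketch; the repo path and file name below are hypothetical.

```python
# Sketch: read a .sql file from a Databricks Repo and run it (path is hypothetical).
sql_path = "/Workspace/Repos/my_user/my_repo/queries/report.sql"

with open(sql_path, "r") as f:
    sql_text = f.read()

# Naive split on ';' — fine for simple files with one statement per semicolon.
for statement in (s.strip() for s in sql_text.split(";")):
    if statement:
        spark.sql(statement)
```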
juanc
by New Contributor II
  • 2574 Views
  • 9 replies
  • 2 kudos

Activate Spark extensions on SQL Endpoints

Would it be possible to activate a custom extension like Sedona (https://sedona.apache.org/download/databricks/) on SQL Endpoints? Example error: java.lang.ClassNotFoundException: org.apache.spark.sql.sedona_sql.UDT.GeometryUDT at org.apache.spark....

Latest Reply
naveenanto
New Contributor III
  • 2 kudos

@Kaniz What is the right way to add a custom Spark extension to SQL warehouse clusters?

8 More Replies
BeginnerBob
by New Contributor III
  • 15182 Views
  • 6 replies
  • 3 kudos

Convert Date to YYYYMMDD in Databricks SQL

Hi, I have a date column in a Delta table called ADate. I need this in the format YYYYMMDD. In T-SQL this is easy. However, I can't seem to do this without splitting the year, month, and day and concatenating them together. Any ideas?

Latest Reply
JayDoubleYou42
New Contributor II
  • 3 kudos

I'll share that I'm having a variant of the same issue. I have a varchar field in the form YYYYMMDD which I'm trying to join to another varchar field from another table in the form MM/DD/YYYY. Does anyone know of a way to do this in Spark SQL without s...

5 More Replies
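For both questions in this thread, date_format and to_date cover the two cases: formatting a DATE column as YYYYMMDD, and joining two varchar columns stored in different date formats. A minimal sketch; apart from ADate, the table and column names are hypothetical.

```python
# Format a DATE column as a yyyyMMdd string (ADate comes from the post).
spark.sql("""
    SELECT date_format(ADate, 'yyyyMMdd') AS adate_yyyymmdd
    FROM my_delta_table
""").show()

# Join a yyyyMMdd varchar to an MM/dd/yyyy varchar by normalising both to DATE.
spark.sql("""
    SELECT *
    FROM t1
    JOIN t2
      ON to_date(t1.dt_yyyymmdd, 'yyyyMMdd') = to_date(t2.dt_mdy, 'MM/dd/yyyy')
""").show()
```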
VVM
by New Contributor III
  • 8110 Views
  • 13 replies
  • 3 kudos

Resolved! Databricks SQL - Unable to Escape Dollar Sign ($) in Column Name

It seems that due to how Databricks processes SQL cells, it's impossible to escape the $ when it comes to a column name. I would expect the following to work: %sql SELECT 'hi' `$id` The backticks ought to escape everything. And indeed that's exactly wha...

Latest Reply
Casper-Bang
New Contributor II
  • 3 kudos

What is the status of this bug report? It's been over a year now.

12 More Replies
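The thread does not record an official fix, but one workaround worth noting (an assumption, not something Databricks confirms here) is that the $-substitution happens in the %sql cell layer rather than in Spark SQL itself, so the same statement issued through spark.sql() from a Python cell keeps the literal column name.

```python
# Sketch of a possible workaround: bypass the %sql cell's $ substitution
# by running the statement through the Python API instead.
df = spark.sql("SELECT 'hi' AS `$id`")
df.printSchema()  # expect a single string column literally named $id
```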
gregorymig
by New Contributor III
  • 2855 Views
  • 9 replies
  • 2 kudos

SQL Serverless option is missing when using Azure Databricks Workspace with No Public IP and VNET Injection

Hello, after creating a Databricks workspace in Azure with No Public IP and VNet injection, I'm unable to use DBSQL Serverless because the option to enable it in SQL warehouse settings is missing. Is it by design? Is it a limitation when using Privat...

Latest Reply
zzthatch
New Contributor II
  • 2 kudos

I am starting up with Databricks and having the same issue. This is very unexpected, since SQL Serverless is advertised so heavily. I have only seen it noted in one place so far that restricted networking can preclude you from using this. Please le...

8 More Replies
Constantine
by Contributor III
  • 8367 Views
  • 3 replies
  • 6 kudos

Resolved! CREATE TEMP TABLE FROM CTE

I have written a CTE in Spark SQL: WITH temp_data AS ( ...... ) CREATE VIEW AS temp_view FROM SELECT * FROM temp_view; I get a cryptic error. Is there a way to create a temp view from a CTE using Spark SQL in Databricks?

Latest Reply
-werners-
Esteemed Contributor III
  • 6 kudos

In the CTE you can't do a CREATE. It expects an expression in the form expression_name [ ( column_name [ , ... ] ) ] [ AS ] ( query ), where expression_name specifies a name for the common table expression. If you want to create a view from a CTE, y...

2 More Replies
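Following the accepted answer, the CTE has to live inside the view definition rather than wrap a CREATE statement. A minimal sketch; the CTE body is a placeholder since the original query is not shown.

```python
# Sketch: define the temp view with the CTE inside it (placeholder CTE body).
spark.sql("""
    CREATE OR REPLACE TEMPORARY VIEW temp_view AS
    WITH temp_data AS (
        SELECT * FROM my_source_table   -- whatever the original CTE selected
    )
    SELECT * FROM temp_data
""")
```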
pranathisg97
by New Contributor III
  • 1002 Views
  • 3 replies
  • 1 kudos

Resolved! Control query caching using SQL statement execution API

I want to execute this statement using the Databricks SQL Statement Execution API: curl -X POST -H 'Authorization: Bearer <access-token>' -H 'Content-Type: application/json' -d '{"warehouse_id": "<warehouse_id>", "statement": "set us...

Latest Reply
TimFrazer
New Contributor II
  • 1 kudos

Did you ever find a solution to this problem?

2 More Replies
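For reference, a minimal sketch of calling the SQL Statement Execution API from Python (same endpoint and payload shape as the post's curl example; host, token, and warehouse id are placeholders). Whether a session-scoped SET such as use_cached_result carries over between separate API calls is exactly what the thread is asking, so that part is left open.

```python
# Sketch: run one statement via the SQL Statement Execution API (placeholders throughout).
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]

resp = requests.post(
    f"{host}/api/2.0/sql/statements/",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "warehouse_id": "<warehouse_id>",
        "statement": "SELECT current_timestamp()",
        "wait_timeout": "30s",
    },
)
resp.raise_for_status()
print(resp.json()["status"])
```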
noimeta
by Contributor II
  • 2387 Views
  • 7 replies
  • 4 kudos

Resolved! Databricks SQL: catalog of each query

Currently, we are migrating from the Hive metastore to UC. We have several dashboards and a huge number of queries whose catalogs have been set to hive_metastore and that use the <db>.<table> access pattern. I'm just wondering if there's a way to switch catalogs...

Latest Reply
abdulrahim
New Contributor II
  • 4 kudos

Absolutely accurate, in order to grow your business you need to create an image of your brand such that it is the first thing coming to customers mind when they think about a certain product or service that’s where social media marketing agencies com...

6 More Replies
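One approach for this kind of migration (a sketch, not a confirmed answer from the thread) is to set the session's current catalog so that existing <db>.<table> references resolve against it; the catalog and table names below are placeholders.

```python
# Sketch: resolve two-part names against a chosen catalog for this session.
spark.sql("USE CATALOG hive_metastore")   # or a Unity Catalog catalog such as `main`
spark.sql("SELECT * FROM my_db.my_table LIMIT 5").show()
```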
Cblunck
by New Contributor II
  • 1728 Views
  • 3 replies
  • 0 kudos

New to Databricks SQL - where clause issue

Hello community, I'm using Databricks SQL for the first time and was hoping I could just copy and paste my queries across from SSMS and update the table names, but it's not working. I found it's the WHERE statement; I updated the ' ' to " " but still ...

Latest Reply
justinghavami
New Contributor II
  • 0 kudos

Hi, were you able to get this figured out? I am having the same issue.

2 More Replies
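The usual gotcha when porting from SSMS: in Databricks SQL, string literals take single quotes and identifiers take backticks rather than square brackets. A small sketch with hypothetical table and column names.

```python
# Sketch of T-SQL-to-Databricks-SQL quoting differences (names are hypothetical).
spark.sql("""
    SELECT order_id, status
    FROM sales.orders                       -- [sales].[orders] in T-SQL
    WHERE status = 'SHIPPED'                -- string literals use single quotes
      AND `order date` >= '2023-01-01'      -- identifiers with spaces use backticks, not [ ]
""").show()
```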
Sen
by New Contributor
  • 3012 Views
  • 8 replies
  • 1 kudos

Resolved! Performance enhancement while writing dataframes into Parquet tables

Hi, I am trying to write the contents of a dataframe into a Parquet table using the command below: df.write.mode("overwrite").format("parquet").saveAsTable("sample_parquet_table") The dataframe contains an extract from one of our source systems, which h...

Latest Reply
MichTalebzadeh
Contributor
  • 1 kudos

Hi, I agree with the reply about the benefits of Delta tables; specifically, Delta brings additional features such as ACID transactions and schema evolution. However, I am not sure whether the problem below, and I quote, "The problem is, this statement ...

7 More Replies
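As a sketch of the directions suggested in the replies (the repartition count is an arbitrary example, and Delta is a suggestion from the thread rather than something the original poster confirmed), two common tweaks are controlling the number of output files and writing Delta instead of plain Parquet:

```python
# Sketch: same write as in the post, with two commonly suggested variations.
(df.repartition(64)                 # tune to the data volume to avoid many tiny files
   .write.mode("overwrite")
   .format("delta")                 # Delta instead of plain Parquet, as the replies suggest
   .saveAsTable("sample_parquet_table"))
```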
Ela
by New Contributor III
  • 638 Views
  • 1 reply
  • 1 kudos

Checking for availability of dynamic data masking functionality in SQL.

I am looking for functionality similar to Snowflake's, which allows attaching a mask to an existing column. The documents I found relate to masking with encryption, but my use case is on an existing table. Solutions using views along with Dynamic Vie...

Latest Reply
sivankumar86
New Contributor II
  • 1 kudos

Unity Catalog provides a similar feature: https://docs.databricks.com/en/data-governance/unity-catalog/row-and-column-filters.html

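Building on the linked row/column filter docs, a minimal sketch of a Unity Catalog column mask attached to an existing table; the function, group, table, and column names are hypothetical.

```python
# Sketch: mask a column for everyone outside a privileged group (names hypothetical).
spark.sql("""
    CREATE OR REPLACE FUNCTION mask_ssn(ssn STRING)
    RETURNS STRING
    RETURN CASE WHEN is_account_group_member('hr_admins') THEN ssn
                ELSE '***-**-****' END
""")

spark.sql("ALTER TABLE customers ALTER COLUMN ssn SET MASK mask_ssn")
```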
lawrence009
by Contributor
  • 1208 Views
  • 5 replies
  • 1 kudos

Contact Support re Billing Error

How do I contact billing support? I am billed through AWS Marketplace and noticed last month the SQL Pro discount is not being reflected in my statement.

Latest Reply
santiagortiiz
New Contributor III
  • 1 kudos

Hi, could anybody provide a contact email? I have sent emails to many contacts described on the support page here and in AWS, but no response from any channel. My problem is that Databricks charged me for the resources used during a free trial, what i...

4 More Replies
ramravi
by Contributor II
  • 5267 Views
  • 2 replies
  • 0 kudos

Is Spark case sensitive? Spark is not case sensitive by default. If you have the same column name in different cases (Name, name) and you try to select eit...

Is Spark case sensitive? Spark is not case sensitive by default. If you have the same column name in different cases (Name, name) and you try to select either the "Name" or "name" column, you will get a column ambiguity error. There is a way to handle this issue b...

Latest Reply
source2sea
Contributor
  • 0 kudos

Hi, even though I set the conf to true, writing to disk threw exceptions complaining about duplicate columns. Below is the error message: org.apache.spark.sql.AnalysisException: Found duplicate column(s) in the data to save: branchavailablity....

1 More Replies
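For reference, the configuration discussed in this thread; the reply above shows that even with it enabled, some writers still reject duplicate column names, so treat this as a sketch of the session setting only.

```python
# Sketch: enable case-sensitive column resolution for the current session.
spark.conf.set("spark.sql.caseSensitive", "true")

df = spark.createDataFrame([(1, 2)], ["Name", "name"])
df.select("Name").show()   # resolves without an ambiguity error when case sensitive
```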