Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

DipakBachhav
by New Contributor III
  • 17475 Views
  • 3 replies
  • 3 kudos

Resolved! Getting error Caused by: com.databricks.NotebookExecutionException: FAILED

I am trying to run the below notebook through Databricks but getting the below error. I have tried to update the notebook timeout and the retry mechanism but still no luck yet.   NotebookData("/Users/mynotebook",9900, retry=3)   ]   res = parallelNot...

Latest Reply
sujai_sparks
New Contributor III
  • 3 kudos

Hi @Dipak Bachhav, not sure if you have fixed the issue, but here are a few things you can check: Is the path "/Users/mynotebook" correct? Maybe you are missing the dot in the beginning. Run the notebook using dbutils.notebook.run("/Users/mynotebook") ...
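For illustration, a minimal sketch of the dbutils.notebook.run call with a simple retry loop (the notebook path and timeout come from the question; the retry helper itself is illustrative, not a Databricks API):

def run_notebook_with_retry(path, timeout_seconds, retries=3, arguments=None):
    # dbutils.notebook.run raises an exception when the child notebook fails
    last_error = None
    for _ in range(retries + 1):
        try:
            return dbutils.notebook.run(path, timeout_seconds, arguments or {})
        except Exception as error:
            last_error = error
    raise last_error

result = run_notebook_with_retry("/Users/mynotebook", 9900)
print(result)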

2 More Replies
ossinova
by Contributor II
  • 2558 Views
  • 1 reply
  • 2 kudos

PIVOT on month and quarter

I want to simplify this query: SELECT year(EntryDate) Year, AccountNumber, sum(CreditBase - DebitBase) FILTER(WHERE month(EntryDate) = 1) AS jan_total, sum(CreditBase - DebitBase) FILTER(WHERE month(EntryDate) = 2) AS feb_total, sum(CreditBase - Debi...

Latest Reply
Lakshay
Databricks Employee
  • 2 kudos

Hi @Oscar Dyremyhr, PIVOT doesn't support two FOR clauses. You can PIVOT either on month or on quarter: https://docs.databricks.com/sql/language-manual/sql-ref-syntax-qry-select-pivot.html
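For reference, a minimal sketch of a single-FOR PIVOT on month, following the column names in the question (the source table name is hypothetical, and only three months are shown; quarter totals would need their own query):

monthly = spark.sql("""
    SELECT * FROM (
        SELECT year(EntryDate)  AS Year,
               AccountNumber,
               month(EntryDate) AS Month,
               CreditBase - DebitBase AS Amount
        FROM ledger_entries               -- hypothetical table name
    )
    PIVOT (
        sum(Amount) FOR Month IN (1 AS jan_total, 2 AS feb_total, 3 AS mar_total)
    )
""")
monthly.show()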

Dale_Ware
by New Contributor III
  • 5590 Views
  • 2 replies
  • 3 kudos

Resolved! How to query a table with backslashes in the name.

I am trying to query a Snowflake table from a Databricks data frame similar to the following example. sql_query = "select * from Database.Schema.Table_/Name_/V" sqlContext.sql(f"{sql_query}") And I get an error like this. ParseException: [PARSE_SYNTAX_...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 3 kudos

You can use double quotes to get this to parse. When using quotes, it is important to write the table name in capital letters: SELECT * FROM "/TABLE/NAME"
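One way to apply that advice, sketched below under the assumption that the table lives in Snowflake and the Spark Snowflake connector is used (the sfOptions connection dictionary is assumed to be defined elsewhere): the double-quoted, upper-case identifier is passed through the connector's query option so that Snowflake itself parses the name, rather than Spark SQL.

sql_query = 'SELECT * FROM DATABASE.SCHEMA."TABLE_/NAME_/V"'

df = (spark.read
      .format("snowflake")          # connector short name; sfOptions holds URL, user, etc.
      .options(**sfOptions)
      .option("query", sql_query)
      .load())
df.show()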

1 More Replies
ramankr48
by Contributor II
  • 21614 Views
  • 5 replies
  • 8 kudos

Resolved! How to get all the table names with a specific column or columns in a database?

Let's say there is a database db in which there are 700 tables, and we need to find all the table names in which the column "project_id" is present. Just an example for understanding the question.

Latest Reply
Anonymous
Not applicable
  • 8 kudos

databaseName = "db"
desiredColumn = "project_id"
database = spark.sql(f"show tables in {databaseName}").collect()
tablenames = []
for row in database:
    # qualify with the database name so this works regardless of the current schema
    cols = spark.table(f"{databaseName}.{row.tableName}").columns
    if desiredColumn in cols:
        tablenames.append(row.tableName)

4 More Replies
William_Scardua
by Valued Contributor
  • 3699 Views
  • 3 replies
  • 1 kudos

Resolved! Upsert when the origin does NOT exist, but you need to change status in the target

Hi guys, I have a question about upsert/merge ... What do you do when the origin does NOT exist, but you need to change status in the target? For example: 01/03: source dataset [id = 1 and status = Active]; target table [*not exists*] >> in this time the ...

Latest Reply
NandiniN
Databricks Employee
  • 1 kudos

Hello @William Scardua, just adding to what @Vigneshraja Palaniraj replied. Reference: https://docs.databricks.com/sql/language-manual/delta-merge-into.html Thanks & Regards, Nandini
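A minimal sketch of such a merge (the target and source table names are placeholders, the id/status columns follow the example in the question, and the WHEN NOT MATCHED BY SOURCE clause assumes a recent runtime that supports it):

spark.sql("""
    MERGE INTO target t
    USING source s
      ON t.id = s.id
    WHEN MATCHED THEN
      UPDATE SET t.status = s.status
    WHEN NOT MATCHED THEN
      INSERT (id, status) VALUES (s.id, s.status)
    WHEN NOT MATCHED BY SOURCE THEN
      UPDATE SET t.status = 'Inactive'   -- row exists in the target but not in the source
""")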

2 More Replies
Ovi
by New Contributor III
  • 6662 Views
  • 5 replies
  • 3 kudos

Resolved! Filter only Delta tables from a list of S3 folders

Hello everyone, from a list of folders on S3, how can I filter which ones are Delta tables, without trying to read each one at a time? Thanks, Ovi

Latest Reply
NandiniN
Databricks Employee
  • 3 kudos

Hello @Ovidiu Eremia, to filter which folders on S3 contain Delta tables, you can look for the specific files that are associated with Delta tables. Delta Lake stores its metadata in a hidden folder named _delta_log, which is located at the root of ...
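A minimal sketch of that check (the bucket path is hypothetical): a folder is treated as a Delta table if it contains a _delta_log subfolder.

def is_delta_table(folder_path):
    try:
        # directory entries returned by dbutils.fs.ls end with "/"
        return any(f.name.rstrip("/") == "_delta_log" for f in dbutils.fs.ls(folder_path))
    except Exception:
        return False   # unreadable or missing folders are treated as non-Delta

folders = [f.path for f in dbutils.fs.ls("s3://my-bucket/landing/")]   # hypothetical bucket
delta_folders = [p for p in folders if is_delta_table(p)]
print(delta_folders)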

4 More Replies
Dataengineer_mm
by New Contributor
  • 3993 Views
  • 1 reply
  • 1 kudos

Surrogate key using identity column.

I want to create a surrogate key in the Delta table, and I used the identity column (id, generated as default). Can I insert rows into the Delta table using only spark.sql, like an INSERT query? Or can I also use the write Delta format options? If I use the df.write ...

Latest Reply
NandiniN
Databricks Employee
  • 1 kudos

Hello @Menaka Murugesan, if you are using the identity column, I believe you would have created the table as below (starts with value 1 and step 1): CREATE TABLE my_table ( id INT IDENTITY (1, 1) PRIMARY KEY, value STRING ) You can insert values i...
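For reference, a minimal sketch using Delta's own identity-column syntax (the CREATE TABLE quoted above is SQL Server-style; the table and column names here are illustrative):

spark.sql("""
    CREATE TABLE IF NOT EXISTS my_table (
        id    BIGINT GENERATED BY DEFAULT AS IDENTITY (START WITH 1 INCREMENT BY 1),
        value STRING
    ) USING DELTA
""")

# a plain INSERT works; id is generated automatically when it is omitted
spark.sql("INSERT INTO my_table (value) VALUES ('first'), ('second')")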

sanjay
by Valued Contributor II
  • 10972 Views
  • 3 replies
  • 5 kudos

Resolved! PySpark UDF is taking long to process

Hi, I have a UDF which runs for each Spark dataframe row, does some complex processing and returns a string output. But it takes very long if the data is 15000 rows. I have configured the cluster with autoscaling, but it's not spinning up more servers. Please suggest h...

Latest Reply
Lakshay
Databricks Employee
  • 5 kudos

Hi @Sanjay Jain, Python UDFs are generally slower to process because each row has to be serialized between the JVM and the Python worker processes, and the UDF logic is a black box to Spark's optimizer; collecting large results back to the driver can also lead to OOM errors on the driver. To resolve this issue, please consider the below: Use Spark built-in functions to do the same functionalit...
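An illustrative sketch of the first suggestion (the column name, threshold, and labels are made up): the same logic as a row-at-a-time Python UDF, as a vectorized pandas UDF, and with built-in functions only.

import pandas as pd
from pyspark.sql import functions as F
from pyspark.sql.functions import pandas_udf

# row-at-a-time Python UDF: every row is serialized between the JVM and Python
@F.udf("string")
def label_slow(amount):
    return "high" if amount > 100 else "low"

# vectorized pandas UDF: whole Arrow batches are processed per call
@pandas_udf("string")
def label_fast(amount: pd.Series) -> pd.Series:
    return (amount > 100).map({True: "high", False: "low"})

base = spark.range(15000)   # toy data, roughly the row count from the question
df_udf      = base.withColumn("label", label_slow("id"))                                      # slowest
df_pandas   = base.withColumn("label", label_fast("id"))                                      # much faster
df_builtins = base.withColumn("label", F.when(F.col("id") > 100, "high").otherwise("low"))    # fastest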

2 More Replies
MShee
by New Contributor II
  • 2778 Views
  • 1 reply
  • 1 kudos
Latest Reply
NandiniN
Databricks Employee
  • 1 kudos

Hello @M Shee, in a dropdown you can select a value from a list of provided values, not type the values in. What you might be interested in is a combobox - it is a combination of text and dropdown. It allows you to select a value from a provided list or ...
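A minimal sketch of the difference (the widget names and choices are illustrative):

# dropdown: the value must be one of the provided choices
dbutils.widgets.dropdown("environment", "dev", ["dev", "test", "prod"], "Environment")

# combobox: pick from the list or type a free-text value
dbutils.widgets.combobox("table_name", "events", ["events", "users"], "Table name")

print(dbutils.widgets.get("table_name"))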

Lu_Wang_SA_DBX
by Databricks Employee
  • 8509 Views
  • 1 reply
  • 3 kudos

We will host the first Databricks Bay Area User Group meeting in the Databricks Mountain View office on March 14, 2:45-5:00 pm PT. We'll have Dave M...

We will host the first Databricks Bay Area User Group meeting in the Databricks Mountain View office on March 14, 2:45-5:00 pm PT. We'll have Dave Mariani - CTO & Founder at AtScale, and Riley Phillips - Enterprise Solution Engineer at Matillion to sha...

Dave Mariani - CTO & Founder, AtScale; Riley Phillips - Enterprise Solution Engineer, Matillion
Latest Reply
amitabharora
Databricks Employee
  • 3 kudos

Looking forward.

Everton_Costa
by New Contributor II
  • 2787 Views
  • 2 replies
  • 1 kudos
Latest Reply
Cami
Contributor III
  • 1 kudos

I hope it helps:
SELECT DATEADD(DAY, rnk - 1, '{{StartDate}}')
FROM (
  WITH lv0(c) AS (SELECT 1 AS c UNION ALL SELECT 1),
       lv1 AS (SELECT t1.c FROM lv0 t1 CROSS JOIN lv0 t2),
       lv2 AS (SELECT t1....
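An alternative sketch that avoids the cross-joined CTEs by generating the dates with sequence() and explode(); the start and end dates below are illustrative literals standing in for the '{{StartDate}}'-style query parameters:

calendar = spark.sql("""
    SELECT explode(sequence(
             to_date('2023-01-01'),      -- stand-in for '{{StartDate}}'
             to_date('2023-12-31'),      -- stand-in for a hypothetical '{{EndDate}}'
             interval 1 day)) AS calendar_date
""")
calendar.show()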

1 More Replies
JacintoArias
by New Contributor III
  • 10467 Views
  • 5 replies
  • 1 kudos

Spark predicate pushdown on parquet files when using limit

Hi, while developing an ETL for a large dataset I want to get a sample of the top rows to check that the pipeline "just runs", so I add a limit clause when reading the dataset. I'm surprised to see that instead of creating a single task as in a sho...

Latest Reply
JacekLaskowski
Databricks MVP
  • 1 kudos

It's been a while since the question was asked, and in the meantime Delta Lake 2.2.0 hit the shelves with the exact feature the OP asked about, i.e. LIMIT pushdown: "LIMIT pushdown into Delta scan. Improve the performance of queries containing LIMIT cl...
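A minimal sketch of the pattern in question (the dataset path is hypothetical); on Delta Lake 2.2+ the LIMIT is pushed into the Delta scan, so only enough files to satisfy it are read:

sample_df = (spark.read
             .format("delta")
             .load("/mnt/data/large_dataset")   # hypothetical path
             .limit(10))
sample_df.show()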

4 More Replies
rsamant07
by New Contributor III
  • 3079 Views
  • 3 replies
  • 2 kudos

Serverless SQL cluster giving error with Power BI

Power BI is giving this error while accessing a Delta table using the serverless SQL endpoint: Error while using path /mnt/xyz/_delta_log/00000000000000000000.checkpoint for resolving path '/xyz/_delta_log/00000000000000000000.checkpoint' within mount at '/mn...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Rahul Samant, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback...

2 More Replies
maaaxx
by New Contributor III
  • 3822 Views
  • 3 replies
  • 4 kudos

A customized Python library in the cluster to access ADLS via secret

Hello dear community, in our current project we would like to develop a customized Python library and deploy this library to all of the clusters to manage access control. You might ask why we don't go via a conventional way like external storage; well, we do not ...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Yuan Gao, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we c...

2 More Replies
andrcami1990
by New Contributor II
  • 10107 Views
  • 2 replies
  • 2 kudos

Resolved! Connect GraphQL to Databricks

Hi, I am new to Databricks; however, I need to expose data found in the Delta Lake directly to GraphQL to be queried by several applications. Is there a connector or something similar for GraphQL that works with Databricks?

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Andrew Camilleri, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feed...

1 More Replies
