Data Engineering
Forum Posts

DipakBachhav
by New Contributor III
  • 9159 Views
  • 4 replies
  • 3 kudos

Resolved! Getting error Caused by: com.databricks.NotebookExecutionException: FAILED

I am trying to run the below notebook through Databricks but am getting the below error. I have tried updating the notebook timeout and the retry mechanism, but still no luck yet.   NotebookData("/Users/mynotebook",9900, retry=3)   ]   res = parallelNot...

Latest Reply
sujai_sparks
New Contributor III
  • 3 kudos

Hi @Dipak Bachhav, not sure if you have fixed the issue, but here are a few things you can check: Is the path "/Users/mynotebook" correct? Maybe you are missing the dot in the beginning. Run the notebook using dbutils.notebook.run("/Users/mynotebook") ...

3 More Replies
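The retry pattern suggested in the reply can be sketched in plain Python. This is a hedged sketch only: dbutils.notebook.run exists solely inside a Databricks notebook, so a stand-in callable (flaky_job) is used here, and the retry count is illustrative.

```python
import time

def run_with_retry(run_notebook, max_retries=3, delay_seconds=0):
    """Call run_notebook() up to max_retries + 1 times, re-raising the last error.

    run_notebook stands in for something like
    dbutils.notebook.run("/Users/mynotebook", 9900), which is only
    available inside a Databricks notebook.
    """
    last_error = None
    for attempt in range(max_retries + 1):
        try:
            return run_notebook()
        except Exception as exc:  # in a notebook this would be NotebookExecutionException
            last_error = exc
            time.sleep(delay_seconds)
    raise last_error

# Example: a job that fails twice, then succeeds on the third attempt.
calls = {"n": 0}

def flaky_job():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("FAILED")
    return "done"

result = run_with_retry(flaky_job, max_retries=3)
```

With max_retries=3 the job is attempted up to four times; here it succeeds on the third attempt.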
Dale_Ware
by New Contributor III
  • 1247 Views
  • 2 replies
  • 3 kudos

Resolved! How to query a table with backslashes in the name.

I am trying to query a Snowflake table from a Databricks data frame, similar to the following example. sql_query = "select * from Database.Schema.Table_/Name_/V" sqlContext.sql(f"{sql_query}") And I get an error like this. ParseException: [PARSE_SYNTAX_...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 3 kudos

You can use double quotes. When using quotes, it is important to write the table names in capital letters: SELECT * FROM "/TABLE/NAME"

1 More Replies
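If the query is executed through Spark SQL (as sqlContext.sql in the question is), special characters in an identifier are usually escaped with backticks rather than double quotes. A minimal sketch of that escaping, assuming the table name from the question:

```python
def quote_spark_identifier(name: str) -> str:
    """Wrap a table/column name in backticks for Spark SQL, escaping any
    embedded backticks by doubling them. (Snowflake itself uses double
    quotes instead, as the reply above notes.)"""
    return "`" + name.replace("`", "``") + "`"

table = "Table_/Name_/V"  # name taken from the question
sql_query = f"SELECT * FROM Database.Schema.{quote_spark_identifier(table)}"
```

The resulting string can then be passed to sqlContext.sql; the helper is a sketch, not part of any Spark API.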
ramankr48
by Contributor II
  • 11441 Views
  • 5 replies
  • 8 kudos

Resolved! How to get the names of all tables with a specific column (or columns) in a database?

Let's say there is a database db containing 700 tables, and we need to find the names of all tables in which the column "project_id" is present. Just an example for understanding the question.

Latest Reply
Anonymous
Not applicable
  • 8 kudos

databaseName = "db"
desiredColumn = "project_id"
database = spark.sql(f"show tables in {databaseName}").collect()
tablenames = []
for row in database:
    cols = spark.table(row.tableName).columns
    if desiredColumn in cols:
        tablenames.append(row.tableName)

4 More Replies
William_Scardua
by Valued Contributor
  • 1109 Views
  • 3 replies
  • 1 kudos

Resolved! Upsert when the origin does NOT exist, but you need to change status in the target

Hi guys, I have a question about upsert/merge... What do you do when the origin does NOT exist, but you need to change status in the target? For example: 01/03: source dataset [id = 1 and status = Active]; target table [*not exists*] >> at this time the ...

Latest Reply
NandiniN
Valued Contributor II
  • 1 kudos

Hello @William Scardua, just adding to what @Vigneshraja Palaniraj replied. Reference: https://docs.databricks.com/sql/language-manual/delta-merge-into.html Thanks & Regards, Nandini

2 More Replies
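For the scenario in the question (a row exists in the target but has disappeared from the source, and its status must be flipped), newer Delta runtimes support MERGE ... WHEN NOT MATCHED BY SOURCE THEN UPDATE. As a hedged illustration of those semantics only, here is a plain-Python toy model; the dict-based tables and the status values are made up:

```python
def merge_status(target: dict, source: dict) -> dict:
    """Toy model of a Delta MERGE keyed by id.

    - id present in source  -> insert/update the target row from source
    - id only in target     -> keep the row but mark it Inactive
    (on Delta this maps to MERGE ... WHEN NOT MATCHED BY SOURCE THEN UPDATE)
    """
    merged = {}
    for key, row in source.items():
        merged[key] = dict(row)  # matched rows and brand-new rows
    for key, row in target.items():
        if key not in source:
            merged[key] = {**row, "status": "Inactive"}  # origin no longer exists
    return merged

# 01/03: id=1 arrives as Active; 02/03: id=1 disappears from the origin.
target = {1: {"status": "Active"}}
source = {}
result = merge_status(target, source)
```

The real work would of course be done by MERGE INTO on the Delta table itself; this only models the row-level outcome.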
Ovi
by New Contributor III
  • 2453 Views
  • 5 replies
  • 3 kudos

Resolved! Filter only Delta tables from an S3 folders list

Hello everyone, From a list of folders on S3, how can I filter which ones are Delta tables, without trying to read each one at a time? Thanks, Ovi

Latest Reply
NandiniN
Valued Contributor II
  • 3 kudos

Hello @Ovidiu Eremia, To filter which folders on S3 contain Delta tables, you can look for the specific files associated with Delta tables. Delta Lake stores its metadata in a hidden folder named _delta_log, which is located at the root of ...

4 More Replies
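The _delta_log check described in the reply can be sketched without Spark. In this sketch, list_children stands in for dbutils.fs.ls or an S3 client listing (neither is available here), and all paths are invented:

```python
def find_delta_tables(folders, list_children):
    """Return the folders that look like Delta tables, i.e. those whose
    immediate children include a _delta_log directory.

    list_children(folder) must return the folder's child paths; it is a
    stand-in for dbutils.fs.ls or an S3 ListObjects call.
    """
    delta = []
    for folder in folders:
        children = {path.rstrip("/").rsplit("/", 1)[-1] for path in list_children(folder)}
        if "_delta_log" in children:
            delta.append(folder)
    return delta

# Fake listing: only /data/events is a Delta table.
listing = {
    "/data/events": ["/data/events/_delta_log/", "/data/events/part-000.parquet"],
    "/data/raw": ["/data/raw/file.csv"],
}
result = find_delta_tables(listing.keys(), lambda f: listing[f])
```

This avoids opening each folder as a table; only a cheap listing per folder is needed.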
MShee
by New Contributor II
  • 748 Views
  • 1 replies
  • 1 kudos
Latest Reply
NandiniN
Valued Contributor II
  • 1 kudos

Hello @M Shee, In a dropdown you can select a value from a list of provided values, not type values in. What you might be interested in is a combobox - it is a combination of text and dropdown. It allows you to select a value from a provided list or ...

Hubert-Dudek
by Esteemed Contributor III
  • 535 Views
  • 1 replies
  • 4 kudos

lnkd.in

Databricks has introduced a new feature that allows users to send SQL statements to their database via REST API. Users can easily integrate this feature with any tool by simply posting their queries to the /api/2.0/sql/statements/ endpoint. With this...

statmentapi
Latest Reply
Kaniz
Community Manager
  • 4 kudos

Hi @Hubert Dudek, Your positive feedback and enthusiasm for our products mean a lot to us, and we appreciate your support. We take great pride in creating products that meet the needs of our customers, and your kind words validate our efforts. Your p...

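A minimal sketch of calling the endpoint mentioned in the post. The request is only built here, never sent; the workspace URL, token, and warehouse_id are placeholders, and the body fields follow the /api/2.0/sql/statements/ endpoint described above:

```python
import json
import urllib.request

host = "https://example.cloud.databricks.com"  # placeholder workspace URL
token = "dapi-XXXX"                            # placeholder personal access token
payload = {
    "statement": "SELECT 1",
    "warehouse_id": "abc123",                  # placeholder SQL warehouse id
}
request = urllib.request.Request(
    f"{host}/api/2.0/sql/statements/",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(request) would actually execute the statement;
# it is deliberately not called in this sketch.
```

Any HTTP client works the same way; the key point is POSTing a JSON body with the statement and the warehouse to run it on.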
Lu_Wang_SA_DBX
by New Contributor III
  • 683 Views
  • 1 replies
  • 3 kudos


We will host the first Databricks Bay Area User Group meeting in the Databricks Mountain View office on March 14, 2:45-5:00 pm PT. We'll have Dave Mariani - CTO & Founder at AtScale, and Riley Phillips - Enterprise Solution Engineer at Matillion to sha...

Dave Mariani - CTO & Founder, AtScale; Riley Phillips - Enterprise Solution Engineer, Matillion
Latest Reply
amitabharora
New Contributor II
  • 3 kudos

Looking forward.

Everton_Costa
by New Contributor II
  • 788 Views
  • 2 replies
  • 1 kudos
Latest Reply
Cami
Contributor III
  • 1 kudos

I hope it helps:

SELECT DATEADD(DAY, rnk - 1, '{{StartDate}}')
FROM (
    WITH lv0(c) AS (SELECT 1 AS c UNION ALL SELECT 1),
    lv1 AS (SELECT t1.c FROM lv0 t1 CROSS JOIN lv0 t2),
    lv2 AS (SELECT t1....

1 More Replies
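The SQL above builds a row-number series and adds it to a start date. The same idea in plain Python, with a hardcoded start date standing in for the {{StartDate}} dashboard parameter:

```python
from datetime import date, timedelta

def date_series(start: date, count: int):
    """Return `count` consecutive dates beginning at `start`, mirroring
    DATEADD(DAY, rnk - 1, start) over rnk = 1..count."""
    return [start + timedelta(days=rnk - 1) for rnk in range(1, count + 1)]

# Stand-in for '{{StartDate}}'; the value is made up.
series = date_series(date(2023, 3, 1), 3)
```

The SQL version exists because dashboards can only run queries; in a notebook the Python form is usually simpler.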
JacintoArias
by New Contributor II
  • 3619 Views
  • 6 replies
  • 2 kudos

Resolved! Spark predicate pushdown on parquet files when using limit

Hi, While developing an ETL for a large dataset, I want to get a sample of the top rows to check that the pipeline "just runs", so I add a limit clause when reading the dataset. I'm surprised to see that instead of creating a single task as in a sho...

Latest Reply
JacekLaskowski
New Contributor II
  • 2 kudos

It's been a while since the question was asked, and in the meantime Delta Lake 2.2.0 hit the shelves with the exact feature the OP asked about, i.e. LIMIT pushdown: LIMIT pushdown into Delta scan. Improve the performance of queries containing LIMIT cl...

5 More Replies
rsamant07
by New Contributor III
  • 1098 Views
  • 3 replies
  • 2 kudos

Serverless SQL Cluster giving error with Power BI

Power BI giving this error while accessing a delta table using the serverless SQL endpoint: Error while using path /mnt/xyz/_delta_log/00000000000000000000.checkpoint for resolving path '/xyz/_delta_log/00000000000000000000.checkpoint' within mount at '/mn...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Rahul Samant, Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback...

2 More Replies
maaaxx
by New Contributor III
  • 879 Views
  • 3 replies
  • 4 kudos

A customized Python library in the cluster to access ADLS via secret

Hello dear community, in our current project we would like to develop a customized Python library and deploy it to all of the clusters to manage access control. You might ask why not go a conventional way like external storage; well, we do not ...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Yuan Gao, Hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we c...

2 More Replies
andrcami1990
by New Contributor II
  • 3317 Views
  • 2 replies
  • 2 kudos

Resolved! Connect GraphQL to Data Bricks

Hi, I am new to Databricks; however, I need to expose data found in the delta lake directly to GraphQL, to be queried by several applications. Is there a connector or something similar to GraphQL that works with Databricks?

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Andrew Camilleri, Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feed...

1 More Replies
maxutil
by New Contributor II
  • 3124 Views
  • 2 replies
  • 3 kudos

Resolved! SQL select string and turn it into a decimal

select col as original,
       col::double as val_double,
       col::float as val_float,
       col::decimal(10,4) as val_decimal,
       to_number(col, '99999.99999') as val_tonum
from int_fx_conversion_rate;

The original value of col is a string such as '1...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Chris Chung, Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback ...

1 More Replies
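The casts in the question can be mirrored in plain Python with the decimal module. In this sketch the sample value of col is made up, quantize with four fractional digits plays the role of decimal(10,4), and the rounding mode is chosen as half-up, which may differ from the SQL engine's default:

```python
from decimal import Decimal, ROUND_HALF_UP

original = "1.23456789"  # hypothetical string value of `col`

# col::double / col::float -> binary floating point
val_double = float(original)

# col::decimal(10,4) -> exact decimal with 4 fractional digits
val_decimal = Decimal(original).quantize(Decimal("0.0001"), rounding=ROUND_HALF_UP)
```

Decimal keeps the value exact at the declared scale, whereas float/double are binary approximations; that distinction is usually why the decimal cast is used for FX rates.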
JaiT
by New Contributor II
  • 840 Views
  • 2 replies
  • 2 kudos

Resolved! DataBricks Workspace Environment

Hi, I am new to Databricks and have started learning about it. I wanted to know if I can use the Databricks workspace without the 3 cloud providers, i.e. AWS, Azure and GCP. If yes, then how?

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Jai Chitkara, Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback...

1 More Replies