Data Engineering

Forum Posts

MikeK_
by New Contributor II
  • 22025 Views
  • 6 replies
  • 0 kudos

Resolved! SQL Update Join

Hi, I'm importing some data and stored procedures from SQL Server into Databricks. I noticed that UPDATE statements with joins are not supported in Spark SQL; what alternative can I use? Here's what I'm trying to do: update t1 set t1.colB=CASE WHEN t2.c...
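
For readers landing here from search: the usual workaround is a Delta MERGE INTO, which expresses the same join-and-update. A minimal sketch, assuming t1 and t2 are Delta tables joined on an id column; the CASE expression is illustrative since the original one is truncated above:

spark.sql("""
    MERGE INTO t1
    USING t2
      ON t1.id = t2.id
    WHEN MATCHED THEN
      UPDATE SET t1.colB = CASE WHEN t2.colA = 'x' THEN 1 ELSE 0 END
""")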

Latest Reply
LyderIversen
New Contributor II
  • 0 kudos

Hi! This is way late, but did you ever find a solution to the CROSS APPLY part of your question? Is it possible to do CROSS APPLY in Spark SQL, or is there something you can use instead?
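
For anyone with the same follow-up: the closest Spark SQL equivalent for the common CROSS APPLY use case (expanding a collection per row) is LATERAL VIEW explode; recent Spark versions also support LATERAL subqueries. A hedged sketch using an illustrative orders table with an array column items:

spark.sql("""
    SELECT o.order_id, i.item
    FROM orders o
    LATERAL VIEW explode(o.items) i AS item
""").show()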

5 More Replies
goldentown
by New Contributor III
  • 1811 Views
  • 1 reply
  • 2 kudos

Resolved! The Jupyter notebook doesn't update imports after updating the .py file

Please help. Here's an example: I have one .py file and one .ipynb, and the .py file contains the function test. After adding a new function test1, it doesn't appear in the .ipynb, even after re-running the .py file and re-importing it in the .ipynb. How...

Latest Reply
goldentown
New Contributor III
  • 2 kudos

%load_ext autoreload
%autoreload 2
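
A minimal sketch of how the autoreload magics fit into the notebook workflow; my_module.py is an illustrative name for the .py file mentioned in the question:

%load_ext autoreload
%autoreload 2

import my_module            # plain module import; autoreload reloads it before each cell runs

my_module.test()
# ...add test1() to my_module.py and save the file...
my_module.test1()           # picked up automatically, no kernel restart needed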

Mado
by Valued Contributor II
  • 1765 Views
  • 4 replies
  • 0 kudos

Medallion architecture: how to update Gold tables?

Assume that I have a data source that is ingested into a few bronze tables and transformed into a silver table. Next, a gold table is created by aggregating the silver table. If new records arrive in the data source, the bronze and silver tables are upd...

Latest Reply
Mado
Valued Contributor II
  • 0 kudos

Hi @Vidula Khanna​, the answer didn't fit my question. For the case of using MERGE, I found a good article here: https://medium.com/@avnishjain22/simplify-optimise-and-improve-your-data-pipelines-with-incremental-etl-on-the-lakehouse-61b279afadea
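
In the spirit of that article, a hedged sketch of refreshing a gold aggregate incrementally with MERGE; all table and column names (silver_sales, gold_daily_sales, sale_date, amount) are illustrative:

spark.sql("""
    MERGE INTO gold_daily_sales g
    USING (
        SELECT sale_date, SUM(amount) AS total_amount
        FROM silver_sales
        WHERE sale_date >= date_sub(current_date(), 7)   -- recompute only recent days
        GROUP BY sale_date
    ) s
    ON g.sale_date = s.sale_date
    WHEN MATCHED THEN UPDATE SET g.total_amount = s.total_amount
    WHEN NOT MATCHED THEN INSERT (sale_date, total_amount) VALUES (s.sale_date, s.total_amount)
""")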

3 More Replies
Kayla
by Contributor
  • 1565 Views
  • 2 replies
  • 0 kudos

BigQuery - Delete or update from Databricks?

I'm trying to sync a Delta table in Databricks to a BigQuery table. For the most part, appending is sufficient, but occasionally we need to overwrite rows, which we've only been able to do by overwriting the entire table. Is there any way to do upda...
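
One hedged option (not the thread's accepted answer): issue the row-level DML through the BigQuery client library directly from the notebook. This assumes google-cloud-bigquery is installed on the cluster and credentials are configured; the project, dataset, table, and column names are illustrative:

from google.cloud import bigquery

client = bigquery.Client(project="my-project")
client.query("""
    DELETE FROM `my-project.my_dataset.target_table`
    WHERE ingest_date < '2023-01-01'
""").result()    # blocks until the DML job finishes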

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Kayla Pomakoy​, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Than...

1 More Replies
huy_le
by New Contributor II
  • 350 Views
  • 0 replies
  • 1 kudos

How to update the OpenAPI specs?

I found a few errors/differences between https://docs.databricks.com/_static/api-refs/permissions-2.0-aws.yaml and the actual API. Where can we update these specs for everyone? Engineers are using these specs to generate their clients.

deisou
by New Contributor
  • 929 Views
  • 1 reply
  • 0 kudos

How to change Databricks Academy username (email)?

Is there a way to change my Databricks Academy username (email)? It is greyed out in my profile and I cannot update it. How do I go about getting it updated?

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, please go through the Databricks Academy FAQ here: https://files.training.databricks.com/lms/docebo/databricks-academy-faq.pdf. Also, please go through the post here: https://community.databricks.com/s/feed/0D53f00001dq6W6CAI.

BorislavBlagoev
by Valued Contributor III
  • 2408 Views
  • 9 replies
  • 3 kudos

Resolved! Trying to create an incremental pipeline, but it fails when I try to use outputMode "update"

def upsertToDelta(microBatchOutputDF, batchId): microBatchOutputDF.createOrReplaceTempView("updates")   microBatchOutputDF._jdf.sparkSession().sql(""" MERGE INTO old o USING updates u ON u.id = o.id WHEN MATCHED THEN UPDATE SE...
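
A hedged, self-contained version of the foreachBatch upsert pattern the excerpt is using, written with the DeltaTable API instead of _jdf; the target table name "old" and the id join column follow the excerpt, while the streaming source is illustrative:

from delta.tables import DeltaTable

def upsert_to_delta(micro_batch_df, batch_id):
    # MERGE each micro-batch into the target Delta table on the id column
    target = DeltaTable.forName(spark, "old")
    (target.alias("o")
           .merge(micro_batch_df.alias("u"), "o.id = u.id")
           .whenMatchedUpdateAll()
           .whenNotMatchedInsertAll()
           .execute())

(source_stream_df.writeStream            # source_stream_df: an illustrative streaming DataFrame
    .foreachBatch(upsert_to_delta)
    .outputMode("update")
    .start())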

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

The Delta table/file version is too old. Please try to upgrade it as described here: https://docs.microsoft.com/en-us/azure/databricks/delta/versioning

8 More Replies
William_Scardua
by Valued Contributor
  • 1704 Views
  • 4 replies
  • 3 kudos

Resolved! Update Databricks Runtime

Hi guys, I need to upgrade my Databricks Runtime (currently 8.0). What precautions should I take? Thank you very much.

Latest Reply
Steward475
New Contributor II
  • 3 kudos

If you want to know the version of the Databricks Runtime in Azure after creation: go to the Azure Databricks portal => Clusters => Interactive Clusters => there you can find the runtime version. For more details, refer to "Azure Databricks Runtime versions". R...

3 More Replies
Sumeet_Dora
by New Contributor II
  • 965 Views
  • 2 replies
  • 4 kudos

Resolved! Write mode features in BigQuery using a Databricks notebook

Currently, using df.write.format("bigquery"), Databricks only supports append and overwrite modes for writing to BigQuery tables. Does Databricks have any option for executing DML like MERGE into BigQuery using Databricks notebooks?

Latest Reply
mathan_pillai
Valued Contributor
  • 4 kudos

@Sumeet Dora​, unfortunately there is no direct "merge into" option for writing to BigQuery using a Databricks notebook. You could write to an intermediate Delta table using the "merge into" option in Delta. Then read from the Delta table and pe...
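
A hedged sketch of that workaround: MERGE the changes into an intermediate Delta table, then push the merged result to BigQuery with the connector. updates_df, the table names, and the GCS bucket are illustrative:

from delta.tables import DeltaTable

staging = DeltaTable.forName(spark, "bq_staging")        # intermediate Delta table
(staging.alias("t")
        .merge(updates_df.alias("s"), "t.id = s.id")     # updates_df: an illustrative DataFrame of changes
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())

(spark.table("bq_staging")
      .write.format("bigquery")
      .option("table", "my_dataset.target_table")
      .option("temporaryGcsBucket", "my-temp-bucket")
      .mode("overwrite")
      .save())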

1 More Replies
GuidoPereyra_
by New Contributor II
  • 6217 Views
  • 2 replies
  • 0 kudos

Databricks Delta - UPDATE error

Hi, we got the following error when we tried to UPDATE a Delta table from concurrent notebooks that all end with an update to the same table: "com.databricks.sql.transaction.tahoe.ConcurrentAppendException: Files were added matching 'true' by a ...

Latest Reply
GuidoPereyra_
New Contributor II
  • 0 kudos

Hi @matt@direction.consulting, I just found the following doc: https://docs.azuredatabricks.net/delta/isolation-level.html#set-the-isolation-level. In my case, I could fix it by partitioning the table, and I think that is the only way for concurrent updates in t...
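
A hedged sketch of that partition-based fix: partition the Delta table and have each concurrent notebook's UPDATE filter on its own partition value, so the transactions touch disjoint files. Table and column names (events, region, status) are illustrative:

spark.sql("""
    CREATE TABLE IF NOT EXISTS events (id BIGINT, region STRING, status STRING)
    USING DELTA
    PARTITIONED BY (region)
""")

# each concurrent notebook updates only its own partition
spark.sql("""
    UPDATE events
    SET status = 'processed'
    WHERE region = 'EU'    -- another notebook would use, e.g., region = 'US'
""")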

1 More Replies