- 4087 Views
- 1 replies
- 2 kudos
Please help. Here's an example: I have one .py file and one .ipynb file. The .py file contains the function test, but after adding a new function test1, it doesn't appear in the .ipynb, even after re-running the .py file and reimporting it in the .ipynb. How...
Latest Reply
%load_ext autoreload
%autoreload 2
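With autoreload active, edits to an imported .py file (such as a newly added function) are picked up on the next cell execution without restarting the kernel. A minimal sketch of the workflow, assuming a module file named mymodule.py next to the notebook (the file and function names are hypothetical):

# In the .ipynb: enable autoreload once per session.
%load_ext autoreload
%autoreload 2

import mymodule          # mymodule.py is a hypothetical file name
mymodule.test()          # existing function works as before

# After adding `def test1(): ...` to mymodule.py and saving the file, the
# next cell execution reloads the module automatically, so the new function
# is reachable without re-importing or restarting the kernel:
mymodule.test1()

Note that a newly added function is reachable via attribute access (mymodule.test1); if you bound names with `from mymodule import ...`, you still need to re-run that import line to bind the new name.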
by
Mado
• Valued Contributor II
- 3798 Views
- 4 replies
- 0 kudos
Assume that I have a data source that is ingested into a few bronze tables and transformed into a silver table. Next, a gold table is created by aggregating the silver table. If new records arrive in the data source, the bronze and silver tables are upd...
Latest Reply
Mado
Valued Contributor II
Hi @Vidula Khanna, the answer didn't fit my question. For the case of using Merge, I found a good article here: https://medium.com/@avnishjain22/simplify-optimise-and-improve-your-data-pipelines-with-incremental-etl-on-the-lakehouse-61b279afadea
3 More Replies
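The linked article covers incremental ETL with MERGE. As a rough, hypothetical sketch of that pattern (the table names, columns, and watermark below are assumptions, not from the thread), newly arrived silver rows can be merged into a gold aggregate like this:

# Hypothetical sketch: incrementally update a gold aggregate with Delta MERGE.
from delta.tables import DeltaTable
from pyspark.sql import functions as F

last_processed_ts = "2024-01-01 00:00:00"   # assumed checkpoint from the previous run

# Aggregate only the silver rows that arrived since the last run.
new_rows = (
    spark.read.table("silver_table")
         .filter(F.col("_ingest_ts") > last_processed_ts)
         .groupBy("customer_id")
         .agg(F.sum("amount").alias("amount_delta"))
)

gold = DeltaTable.forName(spark, "gold_table")
(gold.alias("g")
     .merge(new_rows.alias("n"), "g.customer_id = n.customer_id")
     .whenMatchedUpdate(set={"total_amount": "g.total_amount + n.amount_delta"})
     .whenNotMatchedInsert(values={"customer_id": "n.customer_id",
                                   "total_amount": "n.amount_delta"})
     .execute())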
by
Kayla
• Valued Contributor II
- 3600 Views
- 2 replies
- 0 kudos
I'm trying to sync a Delta table in Databricks to a BigQuery table. For the most part, appending is sufficient, but occasionally we need to overwrite rows - which we've only been able to do by overwriting the entire table. Is there any way to do upda...
Latest Reply
Hi @Kayla Pomakoy, hope all is well! Just wanted to check in to see if you were able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Than...
1 More Replies
by
huy_le
• New Contributor II
- 882 Views
- 0 replies
- 1 kudos
I found a few errors/differences between https://docs.databricks.com/_static/api-refs/permissions-2.0-aws.yaml and the actual API. Where can we update this spec for everyone? Engineers are using this spec to generate their clients.
- 2791 Views
- 1 replies
- 0 kudos
Is there a way to change my Databricks Academy username (email)? It is greyed out in my profile and I cannot update it. How do I go about getting it updated?
Latest Reply
Hi, please go through the Databricks Academy FAQs here: https://files.training.databricks.com/lms/docebo/databricks-academy-faq.pdf. Also, please go through the post here: https://community.databricks.com/s/feed/0D53f00001dq6W6CAI.
- 5372 Views
- 9 replies
- 3 kudos
def upsertToDelta(microBatchOutputDF, batchId):
    microBatchOutputDF.createOrReplaceTempView("updates")
    microBatchOutputDF._jdf.sparkSession().sql("""
        MERGE INTO old o
        USING updates u
        ON u.id = o.id
        WHEN MATCHED THEN UPDATE SE...
Latest Reply
Delta table/file version is too old. Please try to upgrade it as described here https://docs.microsoft.com/en-us/azure/databricks/delta/versioning
8 More Replies
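The snippet above is the standard foreachBatch + MERGE upsert pattern. A complete, hedged sketch of it follows; the target table "old" and key column "id" come from the snippet, while the streaming source and checkpoint path are assumptions:

def upsertToDelta(microBatchOutputDF, batchId):
    # Register the micro-batch as a temp view in the same session that runs the SQL.
    microBatchOutputDF.createOrReplaceTempView("updates")
    # On recent runtimes, microBatchOutputDF.sparkSession.sql(...) also works.
    microBatchOutputDF._jdf.sparkSession().sql("""
        MERGE INTO old o
        USING updates u
        ON u.id = o.id
        WHEN MATCHED THEN UPDATE SET *
        WHEN NOT MATCHED THEN INSERT *
    """)

(spark.readStream.table("source_table")                             # assumed streaming source
      .writeStream
      .foreachBatch(upsertToDelta)
      .option("checkpointLocation", "/tmp/checkpoints/upsert_old")  # assumed path
      .outputMode("update")
      .start())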
- 6601 Views
- 4 replies
- 3 kudos
Hi guys, I need to upgrade my Databricks runtime (currently 8.0). What precautions should I take? Thank you very much.
Latest Reply
If you want to know the version of the Databricks runtime in Azure after creation: go to the Azure Databricks portal => Clusters => Interactive Clusters => here you can find the runtime version. For more details, refer to "Azure Databricks Runtime versions". R...
3 More Replies
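If you prefer to check the runtime version from a notebook rather than the portal, a small sketch follows; the environment variable and Spark conf key below are, to my knowledge, set on Databricks clusters, but verify them on your workspace:

import os

# Databricks sets this environment variable on cluster nodes, e.g. "8.0".
print(os.environ.get("DATABRICKS_RUNTIME_VERSION"))

# The cluster's full Spark version string, e.g. "8.0.x-scala2.12".
print(spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion"))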
- 2265 Views
- 1 replies
- 3 kudos
Currently, using df.write.format("bigquery"), Databricks only supports append and overwrite modes when writing to BigQuery tables. Does Databricks have any option for executing DMLs like MERGE into BigQuery using Databricks notebooks?
Latest Reply
@Sumeet Dora, unfortunately there is no direct "merge into" option for writing to BigQuery from a Databricks notebook. You could write to an intermediate Delta table using the "merge into" option on the Delta table. Then read from the Delta table and pe...
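A hedged sketch of that workaround (the table names, BigQuery target, and GCS bucket below are assumptions): apply the MERGE on an intermediate Delta table, then overwrite the BigQuery table with the merged result.

# Upsert the changed rows into the intermediate Delta table.
updates_df = spark.read.table("updates_source")        # assumed source of changed rows
updates_df.createOrReplaceTempView("updates")
spark.sql("""
    MERGE INTO staging_delta_table t
    USING updates u
    ON t.id = u.id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")

# Push the merged result to BigQuery (overwrite is one of the supported modes).
(spark.read.table("staging_delta_table")
      .write.format("bigquery")
      .option("table", "my_project.my_dataset.my_table")    # assumed target table
      .option("temporaryGcsBucket", "my-temp-bucket")       # assumed staging bucket
      .mode("overwrite")
      .save())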
- 7456 Views
- 2 replies
- 0 kudos
Hi,
We got the following error when we tried to UPDATE a Delta table from concurrent notebooks that all end with an update to the same table.
"
com.databricks.sql.transaction.tahoe.ConcurrentAppendException: Files were added matching 'true' by a ...
Latest Reply
Hi @matt@direction.consulting
I just found the following doc: https://docs.azuredatabricks.net/delta/isolation-level.html#set-the-isolation-level.
In my case, I could fix it by partitioning the table, and I think that is the only way for concurrent updates in t...
1 More Replies
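As a hedged illustration of the partitioning approach mentioned in the reply (the table, the partition column "region", and the per-notebook values are assumptions): partition the target table by a column that separates the concurrent writers, and have each notebook constrain its update to its own partition so the transactions touch disjoint files.

# 1) Rewrite the target table partitioned by the column that distinguishes the writers.
(spark.read.table("events")
      .write.format("delta")
      .partitionBy("region")
      .mode("overwrite")
      .saveAsTable("events_partitioned"))

# 2) Each concurrent notebook scopes its UPDATE to its own partition value,
#    so concurrent transactions touch disjoint files.
spark.sql("""
    UPDATE events_partitioned
    SET status = 'processed'
    WHERE region = 'emea'          -- each notebook uses a different region value
      AND status = 'pending'
""")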