Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

gilo12
by New Contributor III
  • 9638 Views
  • 3 replies
  • 2 kudos

merge into deletes from SOURCE

I am using the following query to make an upsert: MERGE INTO my_target_table AS target USING (SELECT MAX(__my_timestamp) AS checkpoint FROM my_source_table) AS source ON target.name = 'some_name' AND target.address = 'some_address' WHEN MATCHED AN...

Latest Reply
gilo12
New Contributor III
  • 2 kudos

I was using a view for my_source_table; once I changed that to be a table, the issue stopped. That unblocked me, but I think Databricks has a bug with using MERGE INTO from a VIEW.

2 More Replies
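The workaround in the reply can be scripted: materialize the view into a real table before running the merge. Below is a minimal PySpark sketch of that idea; the table and view names follow the post, and the WHEN clauses are a plausible reconstruction since the original query is truncated.

    # Sketch of the workaround: materialize the source view, then MERGE from the table.
    # Names are taken from the post; the WHEN clauses are reconstructed, not original.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # MERGE from a view reportedly misbehaved, so snapshot the view into a Delta table.
    spark.sql("CREATE OR REPLACE TABLE my_source_table AS SELECT * FROM my_source_view")

    spark.sql("""
        MERGE INTO my_target_table AS target
        USING (SELECT MAX(__my_timestamp) AS checkpoint FROM my_source_table) AS source
        ON target.name = 'some_name' AND target.address = 'some_address'
        WHEN MATCHED THEN UPDATE SET target.checkpoint = source.checkpoint
        WHEN NOT MATCHED THEN INSERT (name, address, checkpoint)
            VALUES ('some_name', 'some_address', source.checkpoint)
    """)
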
andreiten
by New Contributor II
  • 4756 Views
  • 1 reply
  • 3 kudos

Is there any example or guideline for how to pass JSON parameters to a pipeline in a Databricks workflow?

I used this source https://docs.databricks.com/workflows/jobs/jobs.html#:~:text=You%20can%20use%20Run%20Now,different%20values%20for%20existing%20parameters.&text=next%20to%20Run%20Now%20and,on%20the%20type%20of%20task. But there is no example of how...

Latest Reply
UmaMahesh1
Honored Contributor III
  • 3 kudos

Hi @Andre Ten, that's exactly how you specify the JSON parameters in a Databricks workflow. I have been doing it in the same format and it works for me. I removed the parameters as they are a bit sensitive, but I hope you get the point. Cheers.

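For readers looking for a concrete example, here is a minimal sketch of passing JSON parameters to a job run via the Jobs 2.1 run-now endpoint and reading them in a notebook task. The host, token, job ID, and parameter names are placeholders.

    # Sketch: trigger a job with notebook parameters via the Jobs 2.1 REST API.
    # DATABRICKS_HOST/DATABRICKS_TOKEN and the job_id below are placeholders.
    import os
    import requests

    host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
    token = os.environ["DATABRICKS_TOKEN"]

    payload = {
        "job_id": 12345,  # hypothetical job ID
        "notebook_params": {
            # A JSON document must be passed as a string value.
            "config": '{"start_date": "2023-01-01", "retries": 3}',
        },
    }
    resp = requests.post(
        f"{host}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {token}"},
        json=payload,
    )
    resp.raise_for_status()
    print(resp.json())  # contains the run_id of the triggered run

    # Inside the notebook task, read and decode the parameter:
    #   import json
    #   config = json.loads(dbutils.widgets.get("config"))
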
amichel
by New Contributor III
  • 7712 Views
  • 3 replies
  • 4 kudos

Resolved! Recommended way to integrate MongoDB as a streaming source

Current state: Data is stored in MongoDB Atlas, which is used extensively by all services. The data lake is hosted in the same AWS region and connected to MongoDB over a private link. Requirements: Streaming pipelines that continuously ingest, transform/analyze and ...

Latest Reply
robwma
New Contributor III
  • 4 kudos

Another option, if you'd like to use Spark for the ingestion, is the new Spark Connector V10.0, which supports Spark Structured Streaming: https://www.mongodb.com/developer/languages/python/streaming-data-apache-spark-mongodb/. If you use Kafka, th...

2 More Replies
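To make the reply concrete, here is a minimal sketch of using the MongoDB Spark Connector v10+ as a Structured Streaming source. The URI, database, collection, and target table are placeholders, and the option names should be verified against the connector docs for your version.

    # Sketch: stream a MongoDB collection into Delta with the Spark Connector v10+.
    # Connection string, database/collection, and target names are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    stream_df = (
        spark.readStream
        .format("mongodb")
        .option("spark.mongodb.connection.uri", "mongodb+srv://user:pass@cluster.example.net")
        .option("spark.mongodb.database", "my_db")
        .option("spark.mongodb.collection", "my_collection")
        # Emit only the changed document, not the full change-stream envelope.
        .option("spark.mongodb.change.stream.publish.full.document.only", "true")
        .load()
    )

    (
        stream_df.writeStream
        .format("delta")
        .option("checkpointLocation", "/tmp/checkpoints/mongo_ingest")  # placeholder
        .toTable("bronze.mongo_my_collection")  # hypothetical target table
    )
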
Confused
by New Contributor III
  • 5145 Views
  • 3 replies
  • 3 kudos

Resolved! Dealing with updates to a delta table being used as a streaming source

Hi All, I have a requirement to perform updates on a delta table that is the source for a streaming query. I would like to be able to update the table and have the stream continue to work, while also not ending up with duplicates. From my research it se...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hey @Mathew Walters, hope you are doing great. Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution? Else please let us know if you need more help. We'd love to hear from you. Thanks!

2 More Replies
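For readers with the same requirement, one commonly documented pattern is to let the stream read past update commits with ignoreChanges and deduplicate downstream in foreachBatch. A minimal sketch, with the paths and the merge key as placeholders:

    # Sketch: stream from a Delta table that also receives updates.
    # ignoreChanges lets the stream continue past UPDATE/MERGE commits, but rows
    # from rewritten files are re-emitted, so the sink must deduplicate.
    # Paths and the "id" key are placeholders.
    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    source = (
        spark.readStream.format("delta")
        .option("ignoreChanges", "true")  # newer runtimes also offer skipChangeCommits
        .load("/mnt/delta/source_table")
    )

    def upsert_batch(batch_df, batch_id):
        # Merge each micro-batch on the business key to absorb re-emitted rows.
        target = DeltaTable.forPath(spark, "/mnt/delta/target_table")
        (
            target.alias("t")
            .merge(batch_df.dropDuplicates(["id"]).alias("s"), "t.id = s.id")
            .whenMatchedUpdateAll()
            .whenNotMatchedInsertAll()
            .execute()
        )

    (
        source.writeStream
        .foreachBatch(upsert_batch)
        .option("checkpointLocation", "/mnt/delta/_checkpoints/target_table")
        .start()
    )
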
Nick_Hughes
by New Contributor III
  • 2124 Views
  • 3 replies
  • 3 kudos

Is there an alerting API please?

Is there an alerting API so that alerts can be source-controlled and automated, please? https://docs.databricks.com/sql/user/alerts/index.html

Latest Reply
Dan_Z
Databricks Employee
  • 3 kudos

Hello @Nick Hughes, as of today we do not expose or document the API for these features. I think it would be a useful feature, so I created an internal feature request for it (DB-I-4289). If you (or any future readers) want more information on this f...

2 More Replies
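A note for future readers: Databricks has since published a REST API for SQL alerts. The sketch below lists alerts so they can be exported to source control; the endpoint path and response shape are assumptions to verify against the current docs for your workspace.

    # Sketch: list SQL alerts over REST for export to source control.
    # The /api/2.0/sql/alerts path and response shape are assumptions; check the
    # current Databricks docs. Host and token are placeholders.
    import os
    import requests

    host = os.environ["DATABRICKS_HOST"]
    token = os.environ["DATABRICKS_TOKEN"]

    resp = requests.get(
        f"{host}/api/2.0/sql/alerts",
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()

    data = resp.json()
    # Older previews returned a bare list; newer versions wrap results in a dict.
    alerts = data.get("results", []) if isinstance(data, dict) else data
    for alert in alerts:
        print(alert.get("id"), alert.get("name"))
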