by gilo12 • New Contributor III
- 10163 Views
- 3 replies
- 2 kudos
I am using the following query to make an upsert:

MERGE INTO my_target_table AS target
USING (SELECT MAX(__my_timestamp) AS checkpoint FROM my_source_table) AS source
ON target.name = 'some_name'
AND target.address = 'some_address'
WHEN MATCHED AN...
Latest Reply
I was using a view for my_source_table; once I changed that to be a table, the issue stopped. That unblocked me, but I think Databricks has a bug with using MERGE INTO from a VIEW.
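The workaround above can be sketched as follows. This is a minimal illustration, not the poster's exact code: the staging table name (`my_source_checkpoint`) and the column list in the MATCHED/NOT MATCHED clauses are hypothetical, since the original query was truncated. The idea is simply to materialize the view into a real table first, then MERGE from that table.

```python
def build_merge_workaround_sql(source_view, staging_table, target_table):
    """Build the two statements for the workaround: materialize the view
    into a real Delta table, then MERGE from that table instead of the view.
    Column names in the MATCHED/NOT MATCHED branches are illustrative."""
    materialize = (
        f"CREATE OR REPLACE TABLE {staging_table} AS "
        f"SELECT MAX(__my_timestamp) AS checkpoint FROM {source_view}"
    )
    merge = (
        f"MERGE INTO {target_table} AS target "
        f"USING {staging_table} AS source "
        "ON target.name = 'some_name' AND target.address = 'some_address' "
        "WHEN MATCHED THEN UPDATE SET target.checkpoint = source.checkpoint "
        "WHEN NOT MATCHED THEN INSERT (name, address, checkpoint) "
        "VALUES ('some_name', 'some_address', source.checkpoint)"
    )
    return [materialize, merge]
```

Each statement would then be run in order with `spark.sql(...)` on the cluster.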
- 5310 Views
- 1 replies
- 3 kudos
I used this source https://docs.databricks.com/workflows/jobs/jobs.html#:~:text=You%20can%20use%20Run%20Now,different%20values%20for%20existing%20parameters.&text=next%20to%20Run%20Now%20and,on%20the%20type%20of%20task. But there is no example of how...
Latest Reply
Hi @Andre Ten, that's exactly how you specify the JSON parameters in a Databricks workflow. I have been doing it in the same format and it works for me. I removed the parameters as they are a bit sensitive, but I hope you get the point. Cheers.
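Since the reply redacted the actual parameters, here is a minimal sketch of the shape such a payload takes when triggering a job run via the Jobs API `run-now` endpoint. The `job_id` value and the parameter names (`env`, `run_date`) are placeholder assumptions; `notebook_params` is the field used for notebook-task parameters.

```python
import json

def build_run_now_payload(job_id, notebook_params):
    """Build the JSON body for POST /api/2.1/jobs/run-now.
    notebook_params is a flat dict of parameter name -> string value,
    passed through to the job's notebook task."""
    return json.dumps({"job_id": job_id, "notebook_params": notebook_params})

# Example with placeholder values:
payload = build_run_now_payload(123, {"env": "prod", "run_date": "2023-01-01"})
```

The same key/value structure is what the Run Now dialog in the workflow UI expects when you paste JSON parameters.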
- 8178 Views
- 3 replies
- 4 kudos
Current state:
- Data is stored in MongoDB Atlas, which is used extensively by all services
- Data lake is hosted in the same AWS region and connected to MongoDB over a private link

Requirements:
- Streaming pipelines that continuously ingest, transform/analyze and ...
Latest Reply
Another option, if you'd like to use Spark for the ingestion, is the new Spark Connector V10.0, which supports Spark Structured Streaming. https://www.mongodb.com/developer/languages/python/streaming-data-apache-spark-mongodb/. If you use Kafka, th...
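To make the reply concrete, here is a small sketch of the reader configuration the v10 connector uses for a streaming read. The option names follow the connector's v10 naming scheme; the URI, database, and collection values are placeholders.

```python
def mongodb_stream_read_config(uri, database, collection):
    """Reader configuration for the MongoDB Spark Connector v10
    structured-streaming source. Values here are placeholders; in a
    real job the URI would point at the Atlas cluster over the
    private link described in the question."""
    return {
        "format": "mongodb",
        "options": {
            "spark.mongodb.connection.uri": uri,
            "spark.mongodb.database": database,
            "spark.mongodb.collection": collection,
        },
    }
```

With PySpark available, this config would be consumed roughly as `spark.readStream.format(cfg["format"]).options(**cfg["options"]).load()`.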
- 5417 Views
- 3 replies
- 3 kudos
Hi All, I have a requirement to perform updates on a Delta table that is the source for a streaming query. I would like to be able to update the table and have the stream continue to work while also not ending up with duplicates. From my research it se...
Latest Reply
Hey @Mathew Walters, hope you are doing great. Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!
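The thread doesn't record a final solution, so the following is an assumption on my part rather than the poster's answer: Delta's streaming reader exposes an `ignoreChanges` option so the stream doesn't fail when source files are rewritten by an UPDATE, with de-duplication left to the downstream query. A minimal sketch of building those reader options:

```python
def delta_stream_reader_options(allow_source_updates):
    """Options for reading a Delta table as a stream when the table may
    receive in-place updates. 'ignoreChanges' makes the reader tolerate
    rewritten files instead of failing; updated rows may be re-emitted,
    so the consumer still needs its own de-duplication step."""
    opts = {}
    if allow_source_updates:
        opts["ignoreChanges"] = "true"
    return opts
```

In a PySpark job these would be applied as `spark.readStream.options(**opts).table("my_delta_table")`, typically paired with `dropDuplicates` on a business key downstream to handle the re-emitted rows.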
- 2253 Views
- 3 replies
- 3 kudos
Is there an alerting API so that alerts can be source-controlled and automated, please? https://docs.databricks.com/sql/user/alerts/index.html
Latest Reply
Dan_Z
Databricks Employee
Hello @Nick Hughes, as of today we do not expose or document the API for these features. I think it would be a useful feature, so I created an internal feature request for it (DB-I-4289). If you (or any future readers) want more information on this f...