@Sumeet Dora, unfortunately there is no direct "MERGE INTO" option when writing to BigQuery from a Databricks notebook. As a workaround, you could run MERGE INTO against an intermediate Delta table, then read from that Delta table and perform a full overwrite of the BigQuery table, so that both the Delta table and BigQuery hold the latest data (sketches of both steps follow below). Hope it helps.
You can refer to the MERGE INTO documentation for Delta tables here: https://docs.databricks.com/spark/latest/spark-sql/language-manual/delta-merge-into.html#merge-into-...
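For the merge step, something like this would work. This is a minimal sketch; the staging Delta table `staging_table`, the incoming DataFrame `updates_df`, and the join key `id` are all hypothetical placeholders for your own names:

```python
# Expose the incoming updates as a temp view so Spark SQL can reference it.
# `updates_df` and `staging_table` are placeholder names, not from the docs.
updates_df.createOrReplaceTempView("updates")

# Upsert the updates into the intermediate Delta table.
spark.sql("""
  MERGE INTO staging_table AS t
  USING updates AS s
  ON t.id = s.id
  WHEN MATCHED THEN UPDATE SET *
  WHEN NOT MATCHED THEN INSERT *
""")
```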
Here are some example notebooks for writing to BigQuery:
https://docs.databricks.com/data/data-sources/google/bigquery.html#example-notebooks
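The overwrite step could then look roughly like this. Again a sketch, not a definitive implementation: the table names and the GCS bucket (`temporaryGcsBucket` is required by the connector for writes) are placeholders you would replace with your own:

```python
# Read the merged, up-to-date Delta table.
merged_df = spark.read.table("staging_table")

# Fully overwrite the BigQuery table with the merged result.
# "my_project.my_dataset.my_table" and "my-temp-bucket" are placeholders.
(merged_df.write
  .format("bigquery")
  .option("table", "my_project.my_dataset.my_table")
  .option("temporaryGcsBucket", "my-temp-bucket")
  .mode("overwrite")
  .save())
```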
We do have filter pushdown, but it only applies when reading data from BigQuery, not when writing. In the meantime, if I find any alternatives, I will post them here.
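For completeness, this is what a read with filter pushdown looks like; the filter on the DataFrame is pushed down to BigQuery so only the matching rows are scanned. Table and column names here are placeholders:

```python
# Filters applied to the DataFrame are pushed down to BigQuery at read time,
# so BigQuery scans only the rows that satisfy the predicate.
df = (spark.read
  .format("bigquery")
  .option("table", "my_project.my_dataset.my_table")
  .load()
  .where("event_date >= '2021-01-01'"))  # pushed down to BigQuery
```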