@Kayla Pomakoy:
I would suggest using the DBSQL capability instead of BigQuery. Feel free to get in touch with your SA if you are an existing customer; we can walk you through the value of using DBSQL.
That aside, yes, there is a BigQuery API you can use to connect your Databricks notebook to a BigQuery table. To update or delete specific rows in the BigQuery table, you can run SQL statements such as UPDATE or DELETE with a WHERE clause that specifies the rows to modify. Finally, read the updated BigQuery table back into Databricks as a DataFrame using the BigQuery Connector for Spark (see the sketch below).
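Here is a minimal sketch of that flow, assuming the Spark BigQuery connector and the google-cloud-bigquery library are available on your cluster and service-account credentials are already configured; the project, dataset, and table names are hypothetical placeholders:

```python
from google.cloud import bigquery

# 1. Run the DML statement (DELETE/UPDATE) through the BigQuery client,
#    since DML goes through the BigQuery API rather than the Spark connector.
#    "my-project", "my_dataset", and "my_table" are hypothetical names.
bq = bigquery.Client(project="my-project")
bq.query(
    "DELETE FROM `my-project.my_dataset.my_table` WHERE status = 'stale'"
).result()  # .result() blocks until the DML job finishes

# 2. Read the updated table back into Databricks as a Spark DataFrame
#    using the BigQuery Connector for Spark. In a Databricks notebook,
#    `spark` is the pre-created SparkSession.
df = (
    spark.read.format("bigquery")
    .option("table", "my-project.my_dataset.my_table")
    .load()
)
df.show()
```

Splitting the work this way keeps responsibilities clear: the BigQuery client handles row-level mutations on the BigQuery side, and the Spark connector is used only for bulk reads back into Databricks.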