On Databricks, I have created a connection of type Google BigQuery and tested the connection successfully. I have then created a foreign catalog from that connection to a Google BigQuery project. I can see all the datasets and tables in the foreign catalog, which is named gcp_connect_catalog.
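For reference, the foreign catalog was set up roughly along these lines (I used the UI, but the equivalent SQL is below; the connection name bq_connection and the project ID my-gcp-project are placeholders):

CREATE FOREIGN CATALOG IF NOT EXISTS gcp_connect_catalog
USING CONNECTION bq_connection
OPTIONS (catalog 'my-gcp-project');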
However, when I run a %sql cell in a notebook on a personal compute cluster that performs a SELECT from a table in the foreign catalog, it returns an error saying the dataset was not found in location US. (The Google BigQuery project resides in North Europe.)
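For illustration, the failing cell is essentially the following, where sales_dataset and orders are placeholder names for the BigQuery dataset and table:

%sql
SELECT * FROM gcp_connect_catalog.sales_dataset.orders LIMIT 10;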
When using a serverless SQL warehouse I have no issues, but a SQL warehouse cannot be used where Python is required. Is there a setting or workaround that needs to be applied to resolve this?