02-03-2025 02:03 AM
On Databricks, I have created a connection of type Google BigQuery and tested the connection successfully. I then created a foreign catalog from the connection to a Google BigQuery project. I can see all the datasets and tables in the foreign catalog, named gcp_connect_catalog.
However, when I run a %sql cell in a notebook on a personal compute cluster that performs a select from a table in the foreign catalog, it returns an error saying the dataset was not found in location US. (The Google BigQuery project resides in North Europe.)
When using a serverless SQL warehouse I have no issues, but a SQL warehouse cannot be used where Python is needed. Is there some resolution or setting that can be applied to resolve this?
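For reference, a minimal sketch of the failing query run from a Python notebook cell; gcp_connect_catalog is the foreign catalog from the post, while the dataset and table names are hypothetical placeholders:

```python
# Minimal repro, run on the personal compute cluster.
# "spark" is the SparkSession preconfigured in Databricks notebooks.
# my_dataset and my_table are placeholder names for illustration.
df = spark.sql(
    "SELECT * FROM gcp_connect_catalog.my_dataset.my_table LIMIT 10"
)
df.show()
# On the affected cluster this fails with an error like:
# "dataset was not found in location US"
```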
Accepted Solutions
02-03-2025 04:51 AM
Hi @GinoBarkley,
Could you please advise which DBR version you are using on your personal compute cluster?
I see these requirements:
- Databricks clusters must use Databricks Runtime 16.1 or above and shared or single user access mode.
- SQL warehouses must be Pro or Serverless.
https://docs.databricks.com/en/query-federation/bigquery.html
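A quick way to confirm the runtime version of the attached cluster from a notebook; this is a sketch assuming the standard DATABRICKS_RUNTIME_VERSION environment variable that Databricks sets in notebook sessions:

```python
import os

# Prints the DBR version string of the attached cluster, e.g. "15.4"
print(os.environ.get("DATABRICKS_RUNTIME_VERSION"))
```

If this prints a version below 16.1, upgrading the personal compute cluster's runtime should be the first thing to try.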