Hello @ismaelhenzel
AFAIK, there is no documented native UC foreign catalog integration for a generic Iceberg REST catalog such as BigLake REST Catalog today.
Databricks does support Iceberg in UC, including UC-managed and foreign Iceberg tables, but the documented foreign Iceberg support goes through Lakehouse Federation with a fixed set of supported external catalogs, such as AWS Glue, Hive Metastore, and Snowflake Horizon Catalog.
BigLake REST Catalog is not listed as a native UC federation target; see https://docs.databricks.com/gcp/en/iceberg/
For GCP, the closest option is Lakehouse Federation to BigQuery: you create a UC connection and a foreign catalog of type bigquery. This works on SQL warehouses, including serverless SQL warehouses, and gives you UC-level discovery and access control.
CREATE CONNECTION my_bq_connection TYPE bigquery
OPTIONS (
  GoogleServiceAccountKeyJson secret('scope', 'bq-service-account-json')
);

CREATE FOREIGN CATALOG my_bq_catalog
USING CONNECTION my_bq_connection
OPTIONS (
  dataProjectId 'my-gcp-project'
);

However, it is not a native Iceberg REST catalog mount. If your BigLake or Iceberg data is exposed through BigQuery as external tables or views, this can be a practical workaround, but keep in mind that views and external tables are materialized by the BigQuery connector, so behavior (including performance and cost) may differ from reading the Iceberg REST catalog directly.
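Once the foreign catalog is created, the BigQuery datasets show up as schemas under it and you can query them through UC's three-level namespace. A quick sanity check, reusing the hypothetical names from the snippet above (my_dataset.my_table is a placeholder for your own dataset and table):

-- List the BigQuery datasets exposed as schemas under the foreign catalog
SHOW SCHEMAS IN my_bq_catalog;

-- Query a BigQuery (or BigLake external) table through UC's three-level namespace
SELECT * FROM my_bq_catalog.my_dataset.my_table LIMIT 10;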
For the pure Iceberg REST approach you showed, I think you are currently limited to classic compute, where you can control the Spark catalog configuration. Serverless compute has restrictions around external data access, Maven data sources, compute-scoped libraries, and most Spark configurations, so it is not a good fit for this kind of custom Spark catalog setup.
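For reference, on classic compute the wiring is plain Spark catalog configuration set on the cluster. Below is a minimal sketch of the cluster's Spark config, assuming the matching Iceberg Spark runtime jar is installed on the cluster; the catalog name, REST endpoint, and warehouse path are placeholders, and the exact authentication properties (tokens, headers, or Google credentials in the BigLake case) depend on your specific catalog:

spark.sql.catalog.my_rest_catalog org.apache.iceberg.spark.SparkCatalog
spark.sql.catalog.my_rest_catalog.type rest
spark.sql.catalog.my_rest_catalog.uri https://your-iceberg-rest-endpoint/path
spark.sql.catalog.my_rest_catalog.warehouse gs://your-bucket/your-path

With that in place, SELECT * FROM my_rest_catalog.some_db.some_table works from notebooks on that cluster, but those tables stay outside UC governance, which is the trade-off versus federation.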
If this answer resolves your question, could you please mark it as “Accept as Solution”? It will help other users quickly find the correct fix.
Senior BI/Data Engineer | Microsoft MVP Data Platform | Microsoft MVP Power BI | Power BI Super User | C# Corner MVP