<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Is there a way to natively mount external Iceberg REST Catalogs (e.g., BigLake) in Unity Catalog in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/is-there-a-way-to-natively-mount-external-iceberg-rest-catalogs/m-p/155734#M54302</link>
    <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/103370"&gt;@ismaelhenzel&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;As far as I know, there is no documented native Unity Catalog (UC) foreign-catalog integration for a generic Iceberg REST catalog such as the BigLake REST Catalog today.&lt;/P&gt;&lt;P&gt;Databricks does support Iceberg in UC, including UC-managed and foreign Iceberg tables, but the documented foreign Iceberg support goes through Lakehouse Federation with a specific set of supported external catalogs, for example AWS Glue, Hive metastore, and Snowflake Horizon Catalog.&lt;/P&gt;&lt;P&gt;The BigLake REST Catalog is not listed as a native UC catalog federation target. &lt;A title="What is Apache Iceberg in Databricks? | Databricks on Google Cloud" href="https://docs.databricks.com/gcp/en/iceberg/" target="_self"&gt;https://docs.databricks.com/gcp/en/iceberg/&lt;/A&gt;&lt;/P&gt;&lt;P&gt;For GCP, the closest option is Lakehouse Federation to BigQuery: you create a UC connection and a foreign catalog of type bigquery. This works with SQL warehouses, including serverless SQL warehouses, and gives you UC-level discovery and access control.&lt;/P&gt;&lt;PRE&gt;CREATE CONNECTION my_bq_connection TYPE bigquery
OPTIONS (
  GoogleServiceAccountKeyJson secret('scope', 'bq-service-account-json')
);

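-- Note: 'scope' and 'bq-service-account-json' above are placeholder secret
-- scope and key names; the service-account JSON must already be stored there.
-- Optional sanity check before creating the catalog:
SHOW CONNECTIONS;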
CREATE FOREIGN CATALOG my_bq_catalog
USING CONNECTION my_bq_connection
OPTIONS (
  dataProjectId 'my-gcp-project'
);&lt;/PRE&gt;&lt;P&gt;This is still not a native Iceberg REST catalog mount, though. If your BigLake or Iceberg data is exposed through BigQuery as external tables or views, this can be a practical workaround: tables appear under the foreign catalog and are queried with the usual three-level namespace (catalog.schema.table). Keep in mind that views and external tables are materialized by the BigQuery connector, so behavior, performance, and cost may differ from reading the Iceberg REST catalog directly.&lt;/P&gt;&lt;P&gt;For the pure Iceberg REST approach you showed, I think you are currently limited to classic compute, where you can control the Spark catalog configuration. Serverless has restrictions around external data access, Maven data sources, compute-scoped libraries, and most Spark configurations, so it is not a good fit for this kind of custom Spark catalog setup.&lt;/P&gt;&lt;HR /&gt;</description>
    <pubDate>Tue, 28 Apr 2026 18:57:25 GMT</pubDate>
    <dc:creator>amirabedhiafi</dc:creator>
    <dc:date>2026-04-28T18:57:25Z</dc:date>
    <item>
      <title>Is there a way to natively mount external Iceberg REST Catalogs (e.g., BigLake) in Unity Catalog?</title>
      <link>https://community.databricks.com/t5/data-engineering/is-there-a-way-to-natively-mount-external-iceberg-rest-catalogs/m-p/155669#M54292</link>
      <description>&lt;P&gt;Hi everyone,&lt;/P&gt;&lt;P&gt;I have been reviewing the documentation on integrating external Iceberg tables with Databricks. Currently, the only method I have found to read from an Iceberg REST catalog (specifically GCP BigLake in my case) is by explicitly passing the catalog configurations directly to the SparkSession.&lt;/P&gt;&lt;P&gt;Here is the approach I am currently using, which works successfully on standard clusters:&lt;/P&gt;&lt;LI-CODE lang="python"&gt;from pyspark.sql import SparkSession

# Define specific variables
gcp_project_id = "mul-dev-databricks"
warehouse_path = f"bq://projects/{gcp_project_id}/locations/us"
gcp_scopes = "https://www.googleapis.com/auth/cloud-platform"

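# NOTE: the project ID, warehouse path, and endpoint here are
# environment-specific examples; the per-catalog keys below are standard
# Iceberg Spark catalog properties (type, uri, warehouse, header.*, io-impl).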
# Build the SparkSession with REST catalog configs
spark = SparkSession.builder.appName("BigLake_Iceberg_App") \
    .config("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions") \
    .config("spark.sql.catalog.spark_catalog", "org.apache.iceberg.spark.SparkCatalog") \
    .config("spark.sql.catalog.spark_catalog.type", "rest") \
    .config("spark.sql.catalog.spark_catalog.uri", "https://biglake.googleapis.com/iceberg/v1/restcatalog") \
    .config("spark.sql.catalog.spark_catalog.warehouse", warehouse_path) \
    .config("spark.sql.catalog.spark_catalog.header.x-goog-user-project", gcp_project_id) \
    .config("spark.sql.catalog.spark_catalog.gcp.auth.scopes", gcp_scopes) \
    .config("spark.sql.catalog.spark_catalog.header.X-Iceberg-Access-Delegation", "vended-credentials") \
    .config("spark.sql.catalog.spark_catalog.rest.auth.type", "org.apache.iceberg.gcp.auth.GoogleAuthManager") \
    .config("spark.sql.catalog.spark_catalog.io-impl", "org.apache.iceberg.gcp.gcs.GCSFileIO") \
    .config("spark.sql.catalog.spark_catalog.rest-metrics-reporting-enabled", "false") \
    .config("spark.sql.defaultCatalog", "spark_catalog") \
    .getOrCreate()&lt;/LI-CODE&gt;&lt;P&gt;While I can successfully access the data this way, it is not an ideal long-term solution. It would be much more powerful if &lt;STRONG&gt;Unity Catalog could mount these REST catalogs directly as external catalogs&lt;/STRONG&gt; natively (similar to Unity Catalog Federation).&lt;/P&gt;&lt;P&gt;Relying on Spark configurations introduces a few key challenges:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;&lt;P&gt;&lt;STRONG&gt;Serverless Limitations:&lt;/STRONG&gt; Serverless compute has strict limitations regarding custom Spark configurations. Setting these at the cluster level completely blocks us from utilizing Serverless SQL/Compute for this data.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;&lt;STRONG&gt;Governance:&lt;/STRONG&gt; Managing connections via code or cluster configs bypasses the centralized governance and access control benefits of Unity Catalog.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;&lt;STRONG&gt;User Experience:&lt;/STRONG&gt; It requires every user or job to duplicate these boilerplate configurations.&lt;/P&gt;&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;&lt;STRONG&gt;My questions to the community:&lt;/STRONG&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;P&gt;Is there currently a supported way to configure an Iceberg REST Catalog connection directly inside Unity Catalog as a Foreign/External Catalog?&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;If not, is there an alternative workaround that plays nicely with Databricks Serverless?&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;Is native Unity Catalog federation for Iceberg REST catalogs on the Databricks roadmap?&lt;/P&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Thanks in advance for any insights!&lt;/P&gt;</description>
      <pubDate>Tue, 28 Apr 2026 10:53:52 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/is-there-a-way-to-natively-mount-external-iceberg-rest-catalogs/m-p/155669#M54292</guid>
      <dc:creator>ismaelhenzel</dc:creator>
      <dc:date>2026-04-28T10:53:52Z</dc:date>
    </item>
    <item>
      <title>Re: Is there a way to natively mount external Iceberg REST Catalogs (e.g., BigLake) in Unity Catalog</title>
      <link>https://community.databricks.com/t5/data-engineering/is-there-a-way-to-natively-mount-external-iceberg-rest-catalogs/m-p/155734#M54302</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/103370"&gt;@ismaelhenzel&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;As far as I know, there is no documented native Unity Catalog (UC) foreign-catalog integration for a generic Iceberg REST catalog such as the BigLake REST Catalog today.&lt;/P&gt;&lt;P&gt;Databricks does support Iceberg in UC, including UC-managed and foreign Iceberg tables, but the documented foreign Iceberg support goes through Lakehouse Federation with a specific set of supported external catalogs, for example AWS Glue, Hive metastore, and Snowflake Horizon Catalog.&lt;/P&gt;&lt;P&gt;The BigLake REST Catalog is not listed as a native UC catalog federation target. &lt;A title="What is Apache Iceberg in Databricks? | Databricks on Google Cloud" href="https://docs.databricks.com/gcp/en/iceberg/" target="_self"&gt;https://docs.databricks.com/gcp/en/iceberg/&lt;/A&gt;&lt;/P&gt;&lt;P&gt;For GCP, the closest option is Lakehouse Federation to BigQuery: you create a UC connection and a foreign catalog of type bigquery. This works with SQL warehouses, including serverless SQL warehouses, and gives you UC-level discovery and access control.&lt;/P&gt;&lt;PRE&gt;CREATE CONNECTION my_bq_connection TYPE bigquery
OPTIONS (
  GoogleServiceAccountKeyJson secret('scope', 'bq-service-account-json')
);
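-- Note: 'scope' and 'bq-service-account-json' above are placeholder secret
-- scope and key names; the service-account JSON must already be stored there.
-- Optional sanity check before creating the catalog:
SHOW CONNECTIONS;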

CREATE FOREIGN CATALOG my_bq_catalog
USING CONNECTION my_bq_connection
OPTIONS (
  dataProjectId 'my-gcp-project'
);&lt;/PRE&gt;&lt;P&gt;This is still not a native Iceberg REST catalog mount, though. If your BigLake or Iceberg data is exposed through BigQuery as external tables or views, this can be a practical workaround: tables appear under the foreign catalog and are queried with the usual three-level namespace (catalog.schema.table). Keep in mind that views and external tables are materialized by the BigQuery connector, so behavior, performance, and cost may differ from reading the Iceberg REST catalog directly.&lt;/P&gt;&lt;P&gt;For the pure Iceberg REST approach you showed, I think you are currently limited to classic compute, where you can control the Spark catalog configuration. Serverless has restrictions around external data access, Maven data sources, compute-scoped libraries, and most Spark configurations, so it is not a good fit for this kind of custom Spark catalog setup.&lt;/P&gt;&lt;HR /&gt;</description>
      <pubDate>Tue, 28 Apr 2026 18:57:25 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/is-there-a-way-to-natively-mount-external-iceberg-rest-catalogs/m-p/155734#M54302</guid>
      <dc:creator>amirabedhiafi</dc:creator>
      <dc:date>2026-04-28T18:57:25Z</dc:date>
    </item>
  </channel>
</rss>

