<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Is there a way to natively mount external Iceberg REST Catalogs (e.g., BigLake) in Unity Catalog? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/is-there-a-way-to-natively-mount-external-iceberg-rest-catalogs/m-p/155669#M54292</link>
    <description>Community question: Is there a way to natively mount external Iceberg REST Catalogs (e.g., BigLake) in Unity Catalog?</description>
    <pubDate>Tue, 28 Apr 2026 10:53:52 GMT</pubDate>
    <dc:creator>ismaelhenzel</dc:creator>
    <dc:date>2026-04-28T10:53:52Z</dc:date>
    <item>
      <title>Is there a way to natively mount external Iceberg REST Catalogs (e.g., BigLake) in Unity Catalog?</title>
      <link>https://community.databricks.com/t5/data-engineering/is-there-a-way-to-natively-mount-external-iceberg-rest-catalogs/m-p/155669#M54292</link>
      <description>&lt;P&gt;Hi everyone,&lt;/P&gt;&lt;P&gt;I have been reviewing the documentation on integrating external Iceberg tables with Databricks. Currently, the only method I have found to read from an Iceberg REST catalog (specifically GCP BigLake in my case) is to pass the catalog configurations explicitly to the SparkSession.&lt;/P&gt;&lt;P&gt;Here is the approach I am currently using, which works on standard clusters:&lt;/P&gt;&lt;LI-CODE lang="python"&gt;from pyspark.sql import SparkSession

# Define specific variables
spark_catalog = "spark_catalog"
gcp_project_id = "mul-dev-databricks"
warehouse_path = f"bq://projects/{gcp_project_id}/locations/us"
gcp_scopes = "https://www.googleapis.com/auth/cloud-platform"

# Build the SparkSession with REST catalog configs
spark = (
    SparkSession.builder.appName("BigLake_Iceberg_App")
    # Enable the Iceberg SQL extensions
    .config("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    # Register an Iceberg REST catalog pointing at the BigLake endpoint
    .config(f"spark.sql.catalog.{spark_catalog}", "org.apache.iceberg.spark.SparkCatalog")
    .config(f"spark.sql.catalog.{spark_catalog}.type", "rest")
    .config(f"spark.sql.catalog.{spark_catalog}.uri", "https://biglake.googleapis.com/iceberg/v1/restcatalog")
    .config(f"spark.sql.catalog.{spark_catalog}.warehouse", warehouse_path)
    # GCP authentication, billing project, and credential vending
    .config(f"spark.sql.catalog.{spark_catalog}.header.x-goog-user-project", gcp_project_id)
    .config(f"spark.sql.catalog.{spark_catalog}.gcp.auth.scopes", gcp_scopes)
    .config(f"spark.sql.catalog.{spark_catalog}.header.X-Iceberg-Access-Delegation", "vended-credentials")
    .config(f"spark.sql.catalog.{spark_catalog}.rest.auth.type", "org.apache.iceberg.gcp.auth.GoogleAuthManager")
    # Data file IO on GCS; REST metrics reporting off
    .config(f"spark.sql.catalog.{spark_catalog}.io-impl", "org.apache.iceberg.gcp.gcs.GCSFileIO")
    .config(f"spark.sql.catalog.{spark_catalog}.rest-metrics-reporting-enabled", "false")
    # Make this catalog the session default
    .config("spark.sql.defaultCatalog", spark_catalog)
    .getOrCreate()
)&lt;/LI-CODE&gt;
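&lt;P&gt;For reference, once the session is created the catalog behaves like any other Spark catalog; a quick sanity check looks like this (the namespace and table names are placeholders):&lt;/P&gt;&lt;LI-CODE lang="python"&gt;# List the namespaces exposed by the BigLake REST catalog
spark.sql("SHOW NAMESPACES").show()

# Read an Iceberg table; two-part names resolve against the default catalog
df = spark.table("my_namespace.my_table")
df.show(5)&lt;/LI-CODE&gt;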
&lt;P&gt;While I can successfully access the data this way, it is not an ideal long-term solution. It would be much more powerful if &lt;STRONG&gt;Unity Catalog could mount these REST catalogs directly as external catalogs&lt;/STRONG&gt; natively (similar to Unity Catalog Federation).&lt;/P&gt;&lt;P&gt;Relying on Spark configurations introduces a few key challenges:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;&lt;P&gt;&lt;STRONG&gt;Serverless limitations:&lt;/STRONG&gt; Serverless compute has strict limits on custom Spark configurations, so requiring these settings at the cluster level blocks us from using Serverless SQL/compute for this data.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;&lt;STRONG&gt;Governance:&lt;/STRONG&gt; Managing connections via code or cluster configs bypasses the centralized governance and access-control benefits of Unity Catalog.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;&lt;STRONG&gt;User experience:&lt;/STRONG&gt; Every user or job has to duplicate these boilerplate configurations (see the helper sketch after this list).&lt;/P&gt;&lt;/LI&gt;&lt;/OL&gt;
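&lt;P&gt;To make the duplication concrete: the best we have today is wrapping the boilerplate in a shared helper along these lines (a minimal sketch; the helper name and defaults are placeholders, not an official API):&lt;/P&gt;&lt;LI-CODE lang="python"&gt;from pyspark.sql import SparkSession

def biglake_session(catalog: str, project_id: str,
                    app_name: str = "BigLake_Iceberg_App") -> SparkSession:
    """Centralize the BigLake Iceberg REST catalog configs in one place."""
    prefix = f"spark.sql.catalog.{catalog}"
    configs = {
        "spark.sql.extensions": "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions",
        prefix: "org.apache.iceberg.spark.SparkCatalog",
        f"{prefix}.type": "rest",
        f"{prefix}.uri": "https://biglake.googleapis.com/iceberg/v1/restcatalog",
        f"{prefix}.warehouse": f"bq://projects/{project_id}/locations/us",
        f"{prefix}.header.x-goog-user-project": project_id,
        f"{prefix}.gcp.auth.scopes": "https://www.googleapis.com/auth/cloud-platform",
        f"{prefix}.header.X-Iceberg-Access-Delegation": "vended-credentials",
        f"{prefix}.rest.auth.type": "org.apache.iceberg.gcp.auth.GoogleAuthManager",
        f"{prefix}.io-impl": "org.apache.iceberg.gcp.gcs.GCSFileIO",
        f"{prefix}.rest-metrics-reporting-enabled": "false",
        "spark.sql.defaultCatalog": catalog,
    }
    builder = SparkSession.builder.appName(app_name)
    for key, value in configs.items():
        builder = builder.config(key, value)
    return builder.getOrCreate()

# Every job still has to import and call this helper, which is exactly
# the boilerplate we would like Unity Catalog to absorb:
spark = biglake_session("spark_catalog", "mul-dev-databricks")&lt;/LI-CODE&gt;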
&lt;P&gt;&lt;STRONG&gt;My questions to the community:&lt;/STRONG&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;P&gt;Is there currently a supported way to configure an Iceberg REST Catalog connection directly inside Unity Catalog as a Foreign/External Catalog?&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;If not, is there an alternative workaround that plays nicely with Databricks Serverless?&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;Is native Unity Catalog federation for Iceberg REST catalogs on the Databricks roadmap?&lt;/P&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Thanks in advance for any insights!&lt;/P&gt;</description>
      <pubDate>Tue, 28 Apr 2026 10:53:52 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/is-there-a-way-to-natively-mount-external-iceberg-rest-catalogs/m-p/155669#M54292</guid>
      <dc:creator>ismaelhenzel</dc:creator>
      <dc:date>2026-04-28T10:53:52Z</dc:date>
    </item>
  </channel>
</rss>