<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Unity Catalog Error: PERMISSION_DENIED: Can not move tables across arclight catalogs (Free Edition) in Databricks Free Edition Help</title>
    <link>https://community.databricks.com/t5/databricks-free-edition-help/unity-catalog-error-permission-denied-can-not-move-tables-across/m-p/149932#M695</link>
    <description>&lt;P&gt;The Code:&lt;/P&gt;&lt;PRE&gt;from pyspark import pipelines as dp
from pyspark.sql import functions
from pyspark.sql.functions import current_date
from pyspark.sql.types import StructType, StructField, StringType, IntegerType, FloatType, BooleanType, ArrayType

@dp.table()
def ingest():
    df = spark.read.table('stream.stream_learning.states_stream')
    df = df.withColumn('processado', current_date())
    return df&lt;/PRE&gt;&lt;P&gt;The error:&lt;/P&gt;&lt;P&gt;Category: Error&lt;BR /&gt;Message: Encountered an error with Unity Catalog while setting up the pipeline on cluster 0225-015320-jpn6b927-v2n.&lt;BR /&gt;Ensure that your Unity Catalog configuration is correct, and that required resources (e.g., catalog, schema) exist and are accessible.&lt;BR /&gt;Also verify that the cluster has appropriate permissions to access Unity Catalog.&lt;/P&gt;&lt;P&gt;Details: PERMISSION_DENIED: Can not move tables across arclight catalogs&lt;BR /&gt;Error class: UNITY_CATALOG_INITIALIZATION_FAILED&lt;BR /&gt;SQL state: 56000&lt;/P&gt;</description>
    <pubDate>Thu, 05 Mar 2026 18:31:39 GMT</pubDate>
    <dc:creator>Schusmeicer</dc:creator>
    <dc:date>2026-03-05T18:31:39Z</dc:date>
    <item>
      <title>Unity Catalog Error: PERMISSION_DENIED: Can not move tables across arclight catalogs (Free Edition)</title>
      <link>https://community.databricks.com/t5/databricks-free-edition-help/unity-catalog-error-permission-denied-can-not-move-tables-across/m-p/149229#M687</link>
      <description>&lt;P&gt;Subject: Unity Catalog Error: PERMISSION_DENIED: Can not move tables across arclight catalogs (Free Edition)&lt;BR /&gt;Body: Hi everyone,&lt;/P&gt;&lt;P&gt;I'm trying to set up a Spark Declarative Pipeline (SDP) using a streaming table on Databricks Free Edition, but I'm hitting a persistent initialization error during the pipeline setup.&lt;/P&gt;&lt;P&gt;The Error: UNITY_CATALOG_INITIALIZATION_FAILED: PERMISSION_DENIED: Can not move tables across arclight catalogs. SQL state: 56000&lt;/P&gt;&lt;P&gt;Context &amp;amp; Setup:&lt;/P&gt;&lt;P&gt;Environment: Databricks Free Edition (Community/Free tier).&lt;/P&gt;&lt;P&gt;Source Table: stream.stream_learning.source_data_stream (Streaming Table)&lt;/P&gt;&lt;P&gt;Target Table: stream.stream_learning.processed_data_ingest (Defined via SDP function decorator)&lt;/P&gt;&lt;P&gt;Cluster: 0225-015320-jpn6b927-v2n.&lt;/P&gt;&lt;P&gt;Both the source and the destination are within the same Catalog and Schema (stream.stream_learning).&lt;/P&gt;&lt;P&gt;It seems like the internal process that moves the data from the temporary/staging area to the final Unity Catalog destination is being flagged as a "cross-catalog move," even though everything is logically in the same namespace.&lt;/P&gt;&lt;P&gt;Has anyone encountered this "arclight catalogs" restriction on the Free Tier? Is there a specific configuration required for SDP when both source and sink are in Unity Catalog, or is this a known limitation of the Free Edition's UC implementation?&lt;/P&gt;&lt;P&gt;Any insights would be greatly appreciated!&lt;/P&gt;</description>
      <pubDate>Wed, 25 Feb 2026 02:11:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/databricks-free-edition-help/unity-catalog-error-permission-denied-can-not-move-tables-across/m-p/149229#M687</guid>
      <dc:creator>Schusmeicer</dc:creator>
      <dc:date>2026-02-25T02:11:03Z</dc:date>
    </item>
    <item>
      <title>Re: Unity Catalog Error: PERMISSION_DENIED: Can not move tables across arclight catalogs (Free Edition)</title>
      <link>https://community.databricks.com/t5/databricks-free-edition-help/unity-catalog-error-permission-denied-can-not-move-tables-across/m-p/149650#M692</link>
      <description>&lt;P&gt;Could you share the code you are using so I can try to reproduce?&lt;/P&gt;</description>
      <pubDate>Tue, 03 Mar 2026 02:46:29 GMT</pubDate>
      <guid>https://community.databricks.com/t5/databricks-free-edition-help/unity-catalog-error-permission-denied-can-not-move-tables-across/m-p/149650#M692</guid>
      <dc:creator>MoJaMa</dc:creator>
      <dc:date>2026-03-03T02:46:29Z</dc:date>
    </item>
    <item>
      <title>Re: Unity Catalog Error: PERMISSION_DENIED: Can not move tables across arclight catalogs (Free Edition)</title>
      <link>https://community.databricks.com/t5/databricks-free-edition-help/unity-catalog-error-permission-denied-can-not-move-tables-across/m-p/149932#M695</link>
      <description>&lt;P&gt;The Code:&lt;/P&gt;&lt;PRE&gt;from pyspark import pipelines as dp
from pyspark.sql import functions
from pyspark.sql.functions import current_date
from pyspark.sql.types import StructType, StructField, StringType, IntegerType, FloatType, BooleanType, ArrayType

@dp.table()
def ingest():
    df = spark.read.table('stream.stream_learning.states_stream')
    df = df.withColumn('processado', current_date())
    return df&lt;/PRE&gt;&lt;P&gt;The error:&lt;/P&gt;&lt;P&gt;Category: Error&lt;BR /&gt;Message: Encountered an error with Unity Catalog while setting up the pipeline on cluster 0225-015320-jpn6b927-v2n.&lt;BR /&gt;Ensure that your Unity Catalog configuration is correct, and that required resources (e.g., catalog, schema) exist and are accessible.&lt;BR /&gt;Also verify that the cluster has appropriate permissions to access Unity Catalog.&lt;/P&gt;&lt;P&gt;Details: PERMISSION_DENIED: Can not move tables across arclight catalogs&lt;BR /&gt;Error class: UNITY_CATALOG_INITIALIZATION_FAILED&lt;BR /&gt;SQL state: 56000&lt;/P&gt;</description>
      <pubDate>Thu, 05 Mar 2026 18:31:39 GMT</pubDate>
      <guid>https://community.databricks.com/t5/databricks-free-edition-help/unity-catalog-error-permission-denied-can-not-move-tables-across/m-p/149932#M695</guid>
      <dc:creator>Schusmeicer</dc:creator>
      <dc:date>2026-03-05T18:31:39Z</dc:date>
    </item>
    <item>
      <title>Re: Unity Catalog Error: PERMISSION_DENIED: Can not move tables across arclight catalogs (Free Edition)</title>
      <link>https://community.databricks.com/t5/databricks-free-edition-help/unity-catalog-error-permission-denied-can-not-move-tables-across/m-p/150206#M699</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/189838"&gt;@Schusmeicer&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;The "Can not move tables across arclight catalogs" error you are seeing is specific to how Unity Catalog is managed on the Databricks Free Edition. "Arclight" is the internal infrastructure name for the Free Edition's managed Unity Catalog environment, and it enforces certain restrictions on how tables and pipeline artifacts can be created and moved within that environment.&lt;/P&gt;
&lt;P&gt;When a Lakeflow Spark Declarative Pipeline (SDP) runs, it internally creates temporary staging tables and then moves them into the target catalog and schema. On the Free Edition, this internal move operation can be blocked because the managed catalog infrastructure treats the staging area and your target catalog as separate "arclight catalogs," even though from your perspective everything is in the same catalog and schema.&lt;/P&gt;
&lt;P&gt;Here are a few things to check and try:&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;VERIFY YOUR PIPELINE CONFIGURATION&lt;/P&gt;
&lt;P&gt;Make sure your pipeline's default catalog and schema are explicitly set to match where your source and target tables live:&lt;/P&gt;
&lt;P&gt;1. Open your pipeline in the Databricks workspace UI.&lt;BR /&gt;2. Under the pipeline settings, confirm that the "Default catalog" is set to "stream" and the "Default schema" is set to "stream_learning".&lt;BR /&gt;3. Make sure the pipeline was created with Unity Catalog mode (not Hive metastore).&lt;/P&gt;
&lt;P&gt;This ensures the pipeline's internal staging operations happen in the correct catalog context.&lt;/P&gt;
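As a sketch only, the defaults above can also be checked in the pipeline's JSON settings (open the pipeline in the Pipelines UI and switch to the JSON view). The pipeline name and notebook path below are placeholders, and depending on your workspace version the schema field may appear as "target" or "schema":

```json
{
  "name": "states-ingest-pipeline",
  "catalog": "stream",
  "target": "stream_learning",
  "libraries": [
    { "notebook": { "path": "/Workspace/Users/your-user/sdp_ingest" } }
  ]
}
```

If "catalog" or "target" is missing or points somewhere else, the pipeline's staging operations will resolve against the wrong namespace.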
&lt;P&gt;&lt;BR /&gt;USE A FULLY QUALIFIED TABLE NAME IN THE DECORATOR&lt;/P&gt;
&lt;P&gt;Instead of relying on the default catalog/schema resolution, try specifying the full name in your @dp.table() decorator:&lt;/P&gt;
&lt;PRE&gt;from pyspark import pipelines as dp
from pyspark.sql.functions import current_date

@dp.table(name="stream.stream_learning.ingest")
def ingest():
    df = spark.read.table("stream.stream_learning.states_stream")
    df = df.withColumn("processado", current_date())
    return df&lt;/PRE&gt;
&lt;P&gt;This can help the pipeline engine resolve the correct destination without ambiguity during the internal staging process.&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;CONFIRM YOU ARE NOT EXCEEDING THE ONE-PIPELINE LIMIT&lt;/P&gt;
&lt;P&gt;The Free Edition allows only one active pipeline per pipeline type. If you have another SDP pipeline that is already active (even in a failed or initializing state), that could cause conflicts. Go to the Pipelines section of your workspace, stop or delete any other active pipelines, and then retry.&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;TRY RECREATING THE PIPELINE FROM SCRATCH&lt;/P&gt;
&lt;P&gt;If the above steps do not resolve it, try deleting the current pipeline entirely and creating a new one:&lt;/P&gt;
&lt;P&gt;1. Delete the existing pipeline from the Pipelines UI.&lt;BR /&gt;2. Create a new pipeline.&lt;BR /&gt;3. Set the default catalog to "stream" and default schema to "stream_learning".&lt;BR /&gt;4. Attach your notebook with the SDP code.&lt;BR /&gt;5. Run the pipeline.&lt;/P&gt;
&lt;P&gt;This can help clear any stale internal state that may be contributing to the cross-catalog error.&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;CHECK FOR EXISTING TABLE CONFLICTS&lt;/P&gt;
&lt;P&gt;If a table named "ingest" (or whatever name your @dp.table function resolves to) already exists in the target schema and was not originally created by a pipeline, that can also trigger this type of error. You can check by running:&lt;/P&gt;
&lt;P&gt;SHOW TABLES IN stream.stream_learning;&lt;/P&gt;
&lt;P&gt;If a conflicting table exists, try either dropping it first or using a different name in your @dp.table() decorator.&lt;/P&gt;
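The conflict check can be scripted as a minimal sketch. Assumptions: you run this in a regular Databricks notebook where "spark" is defined, and "ingest" is the name your @dp.table function resolves to; the helper itself is plain Python:

```python
def find_conflicts(existing_tables, pipeline_tables):
    """Return tables that already exist in the target schema AND are also
    defined by the pipeline -- candidates for the ownership conflict above."""
    return sorted(set(existing_tables).intersection(pipeline_tables))

# In a notebook you would populate the listing from Unity Catalog, e.g.:
#   existing = [r.tableName for r in spark.sql(
#       "SHOW TABLES IN stream.stream_learning").collect()]
existing = ["states_stream", "ingest"]   # example listing, for illustration only
pipeline_defined = ["ingest"]            # tables your SDP source code defines

print(find_conflicts(existing, pipeline_defined))  # -> ['ingest']
```

Only drop a reported conflict if you are sure the table was not created by the pipeline itself; otherwise rename the pipeline table instead.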
&lt;P&gt;&lt;BR /&gt;ADDITIONAL NOTES&lt;/P&gt;
&lt;P&gt;The Free Edition documentation lists some constraints relevant to pipelines:&lt;BR /&gt;&lt;A href="https://docs.databricks.com/en/getting-started/free-edition-limitations.html" target="_blank"&gt;https://docs.databricks.com/en/getting-started/free-edition-limitations.html&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;For more on configuring Lakeflow Spark Declarative Pipelines with Unity Catalog:&lt;BR /&gt;&lt;A href="https://docs.databricks.com/en/delta-live-tables/unity-catalog.html" target="_blank"&gt;https://docs.databricks.com/en/delta-live-tables/unity-catalog.html&lt;/A&gt;&lt;BR /&gt;&lt;A href="https://docs.databricks.com/en/delta-live-tables/configure-pipeline.html" target="_blank"&gt;https://docs.databricks.com/en/delta-live-tables/configure-pipeline.html&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;If none of these steps resolve the issue, it may be worth opening a support request through the Help Center, as the error could relate to an internal state issue with your Free Edition workspace that requires backend attention.&lt;/P&gt;
&lt;P&gt;* This reply was drafted with an agent system I built, which researches and drafts responses from the documentation I have available and from previous memory. I personally review each draft for obvious issues and to monitor system reliability, and I update the system when I detect drift, but there is still a small chance something is inaccurate, especially if you are experimenting with brand-new features.&lt;/P&gt;</description>
      <pubDate>Sun, 08 Mar 2026 07:41:56 GMT</pubDate>
      <guid>https://community.databricks.com/t5/databricks-free-edition-help/unity-catalog-error-permission-denied-can-not-move-tables-across/m-p/150206#M699</guid>
      <dc:creator>SteveOstrowski</dc:creator>
      <dc:date>2026-03-08T07:41:56Z</dc:date>
    </item>
  </channel>
</rss>

