Unity Catalog Error: PERMISSION_DENIED: Can not move tables across arclight catalogs (Free Edition)

Schusmeicer
New Contributor II

Hi everyone,

I'm trying to set up a Spark Declarative Pipeline (SDP) using a streaming table on Databricks Free Edition, but I keep hitting the same initialization error whenever the pipeline starts.

The Error: UNITY_CATALOG_INITIALIZATION_FAILED: PERMISSION_DENIED: Can not move tables across arclight catalogs. SQL state: 56000

Context & Setup:

Environment: Databricks Free Edition (Community/Free tier).

Source Table: stream.stream_learning.source_data_stream (Streaming Table)

Target Table: stream.stream_learning.processed_data_ingest (Defined via SDP function decorator)

Cluster: 0225-015320-jpn6b927-v2n.

Both the source and the destination are within the same Catalog and Schema (stream.stream_learning).
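For context, the target table is defined with the standard declarative-pipelines decorator. Below is a minimal sketch of roughly what my setup looks like (table names are the real ones from above; the `ingested_at` column is just an illustrative transformation, and `spark` is injected by the pipeline runtime, so this only runs inside a Databricks pipeline, not as standalone Python):

```python
import dlt  # available only inside a Databricks pipeline run
from pyspark.sql import functions as F

# Target streaming table, in the same catalog.schema as the source
# (stream.stream_learning).
@dlt.table(name="processed_data_ingest")
def processed_data_ingest():
    return (
        spark.readStream
             .table("stream.stream_learning.source_data_stream")
             .withColumn("ingested_at", F.current_timestamp())  # illustrative column
    )
```

The error is raised during pipeline initialization, before this function body ever produces any rows.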

It looks as though the internal step that moves data from the temporary/staging location to the final Unity Catalog destination is being flagged as a "cross-catalog move," even though everything is logically in the same namespace.

Has anyone encountered this "arclight catalogs" restriction on the Free Tier? Is there a specific configuration required for SDP when both source and sink are in Unity Catalog, or is this a known limitation of the Free Edition's UC implementation?

Any insights would be greatly appreciated!

Data Analyst | Python, PySpark & AWS | MBA in Data Science (USP/Esalq) | Databricks & Data Infrastructure