Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Moving tables between pipelines in production

liquibricks
New Contributor II

We are testing an ingestion from Kafka to Databricks using a streaming table. The streaming table was created by a DAB (Databricks Asset Bundle) deployed to "production", which runs as a service principal. This means the service principal is the owner of the table.

We now want to move that streaming table to a different pipeline without losing the data that has been saved in the existing streaming table. I see there is some documentation for that here: Move tables between Lakeflow Declarative Pipelines - Azure Databricks | Microsoft Learn

The problem is that I personally cannot execute the SQL to alter the streaming table (PERMISSION_DENIED: User is not an owner of Pipeline '....'.). Unfortunately, it also seems that one cannot change the owner of an existing streaming table ([STREAMING_TABLE_OPERATION_NOT_ALLOWED.UNSUPPORTED_OPERATION] The operation SET OWNER is not allowed: The operation is not supported on Streaming Tables. SQLSTATE: 42601).

Is there another way to change the pipeline ID for the existing table? I guess a workaround would be to run the ALTER TABLE script as part of the DAB so that it is executed as the service principal?

3 REPLIES

nayan_wylde
Esteemed Contributor

You’ve hit two limitations:

Streaming tables don’t allow SET OWNER – ownership cannot be changed.
Lakeflow pipeline ID changes require pipeline-level permissions – if you’re not the pipeline owner, you can’t run ALTER STREAMING TABLE ... SET PIPELINE_ID.


Practical Workarounds
Option 1: Execute the ALTER as the Service Principal

Yes, your idea works: include the ALTER STREAMING TABLE ... SET PIPELINE_ID '<new_pipeline_id>' in the DAB for the new pipeline.
When the DAB runs under the service principal (which owns the table), the command succeeds.
This is the recommended approach per Databricks docs.
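For Option 1, a minimal sketch of the migration statement you would ship with the DAB so it runs as the table-owning service principal. The catalog, schema, table name, and pipeline ID below are placeholders, and the exact syntax should be checked against the Microsoft Learn page linked above:

```sql
-- Hypothetical migration script, deployed and executed via the DAB under
-- the service principal that owns the table. All names are placeholders.
ALTER STREAMING TABLE my_catalog.my_schema.kafka_events
SET PIPELINE_ID '<new_pipeline_id>';
```

Because the DAB job runs as the service principal, the ownership check on the pipeline and table passes without granting any human user extra permissions.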

Option 2: Ask for temporary pipeline ownership

If possible, have the pipeline owner grant you Pipeline Owner permission so you can run the SQL manually.
After migration, revoke the permission.

Option 3: Clone the table (last resort)

If ownership cannot be changed and you cannot run as the service principal, you’d need to create a new streaming table and backfill from the old one.
This is more complex and usually unnecessary if Option 1 is possible.
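If it ever came to Option 3, one common backfill pattern (sketched with placeholder names, not something from this thread) is to first snapshot the existing streaming table into a plain Delta table, then have the new pipeline read that snapshot once alongside the live Kafka feed:

```sql
-- Last-resort sketch; all names are placeholders.
-- 1. Snapshot the old streaming table's data into a regular Delta table
--    (streaming tables cannot be written to directly).
CREATE TABLE my_catalog.my_schema.kafka_events_snapshot AS
SELECT * FROM my_catalog.my_schema.kafka_events;

-- 2. In the new pipeline, define the replacement streaming table so that it
--    consumes new records from Kafka and backfills once from the snapshot;
--    the exact backfill mechanism depends on the pipeline API you use.
```

As noted, this duplicates data and pipeline logic, so it is only worth it when the ALTER-as-service-principal route is unavailable.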

Thanks for your reply, Nayan.

I'll try option 1 and report back later.

Is option 2 possible? Per the second error message, it doesn't appear that we can change the owner of streaming tables.

Option 3 would be too complex and time-consuming, but thanks for the idea.

Oh, in the roadmap webinar today they just announced upcoming SQL support for changing the owner of streaming tables and materialized views. Maybe I'll wait for that then! 😅