Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Error updating schema: SCHEMA_FOREIGN_SQLSERVER update_mask requirement.

AngelShrestha
Databricks Partner

What I tried: updating the description via the UI (AI Suggested Description / manual edit).

I’m running into an issue while trying to update the description for the schema.

Context:

  • Type: SCHEMA_FOREIGN_SQLSERVER

    Error message:

    Failed to save description. Please try again later.
    Securable `dbo` of type SCHEMA_FOREIGN_SQLSERVER can only be updated by specifying update_mask field with the list of fields to update (unless it's ownership transfer), and only the owner, managed_storage_root, securable_kind, predictive_optimization fields can be updated.
     
Angel Shrestha
5 REPLIES

Sumit_7
Honored Contributor II

Hey Shrestha,

Based on the error message, this looks like a permissions issue. Please contact your workspace/account admin.

Thanks.

emma_s
Databricks Employee

Hi,

This isn't a permissions issue but a known limitation of Lakehouse Federation. Foreign securables (like SCHEMA_FOREIGN_SQLSERVER) only allow a restricted set of metadata fields to be updated; description and comment are not among them.

This is because foreign schema metadata is mirrored from the source system, and Unity Catalog currently treats it as read-only for most fields.
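To make the error message more concrete, here is a minimal, hypothetical sketch (not Databricks source code) of the "update_mask" pattern it describes: a partial update names exactly the fields it wants to change, and the server rejects any field outside the allow-list the error lists for this securable kind.

```python
# Fields the error message says are updatable for SCHEMA_FOREIGN_SQLSERVER.
ALLOWED_FIELDS = {
    "owner",
    "managed_storage_root",
    "securable_kind",
    "predictive_optimization",
}

def validate_update_mask(update_mask):
    """Return (ok, rejected) for a requested list of fields to update."""
    rejected = [f for f in update_mask if f not in ALLOWED_FIELDS]
    return (not rejected, rejected)

# Trying to update the description/comment is rejected,
# which is what the UI surfaces as "Failed to save description":
print(validate_update_mask(["comment"]))  # (False, ['comment'])

# An ownership transfer touches only an allowed field:
print(validate_update_mask(["owner"]))    # (True, [])
```

This is only an illustration of why the UI's description edit fails while ownership transfer succeeds; the actual server-side validation is internal to Unity Catalog.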

This has been raised as a feature request and is being tracked by the product team. If this is important for your use case, I'd recommend raising it with your Databricks account team so your request gets added to the prioritization list.

Thanks,

Emma

Thank you @emma_s,
As far as I know, if we use Lakeflow Connect, the tables get treated as managed tables.
Can you confirm whether that's true, and whether we can use Lakeflow Connect to connect to SQL Server and then edit the metadata?

Angel Shrestha

emma_s
Databricks Employee

Hi, yes, 100%: if you use Lakeflow Connect, it will ingest the data and the tables will become managed tables, which do support descriptions and comments. You should also see some query performance improvement, since you're actually moving the data rather than querying it in place.
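Once the objects are managed, their comment can be set programmatically as well as through the UI. As a hedged sketch, the snippet below only builds the request for a Unity Catalog schema comment update; the endpoint path and field name are assumptions based on the public REST API shape, so verify them against your workspace's API documentation before use.

```python
import json

def build_schema_comment_request(full_name, comment):
    """Build (method, path, body) for a schema comment update.

    Assumed endpoint: PATCH /api/2.1/unity-catalog/schemas/{full_name}
    with a JSON body containing the new "comment". No request is sent here.
    """
    path = f"/api/2.1/unity-catalog/schemas/{full_name}"
    body = json.dumps({"comment": comment})
    return "PATCH", path, body

method, path, body = build_schema_comment_request(
    "my_catalog.dbo", "Mirrored from SQL Server via Lakeflow Connect"
)
print(method, path, body)
```

For managed schemas you could equally use SQL (`COMMENT ON SCHEMA ... IS '...'`); the point is that this works once the objects are no longer foreign securables.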

Hi @emma_s,
Can you confirm: if I create a catalog from here, will this be considered Lakeflow Connect?
[Screenshot: Screenshot 2026-04-06 100525.png]

Angel Shrestha