Community Platform Discussions

Alter table to add/update multiple column comments

bharathjs
Visitor

I was wondering if there's a way to alter a table and add/update comments for multiple columns at once using SQL or API calls. For instance:

ALTER TABLE <table_name> CHANGE COLUMN <col1> COMMENT '<comment1>',
CHANGE COLUMN <col2> COMMENT '<comment2>';

I would like to update column comments across multiple tables, covering over 200 fields, ideally in a flexible way using SQL, PySpark, or APIs. It would be great if there is a way to achieve this. Thanks in advance.
2 REPLIES

filipniziol
Contributor

Hi, assuming you have a dictionary mapping column names to comments, you can do this with PySpark like this:

columns_comments = {
    "col1": "comment1",
    "col2": "comment2",
    # Add all your columns and comments here
}

# Issue one ALTER TABLE statement per column to set its comment
for col, comment in columns_comments.items():
    spark.sql(f"ALTER TABLE your_table_name CHANGE COLUMN {col} COMMENT '{comment}'")

Yeah, I did consider this approach, but one drawback I observed is that it takes a while to update comments for ~200 columns. I'm looking for a more reliable solution that updates column comments concurrently.
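
For reference, one way to run the statements concurrently from a notebook is a small thread pool. This is only a sketch under a few assumptions: the table name, helper name, and pool size are illustrative, and spark is the notebook's SparkSession.

from concurrent.futures import ThreadPoolExecutor, as_completed

columns_comments = {
    "col1": "comment1",
    "col2": "comment2",
    # ... the rest of the ~200 columns
}

def set_comment(col, comment):
    # Escape single quotes so the comment text doesn't break the SQL string literal
    safe_comment = comment.replace("'", "\\'")
    spark.sql(f"ALTER TABLE your_table_name ALTER COLUMN {col} COMMENT '{safe_comment}'")
    return col

# Submit one metadata-only ALTER statement per column to a small thread pool
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = [pool.submit(set_comment, c, cm) for c, cm in columns_comments.items()]
    for f in as_completed(futures):
        print(f"Updated comment for {f.result()}")

Each ALTER TABLE on a Delta table is its own metadata transaction, so concurrent changes to the same table can occasionally conflict; keeping the pool small or adding a simple retry helps.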
