01-17-2024 10:36 AM
Hi,
I have an external table that reads data from an S3 bucket. The S3 bucket is expected to receive new files frequently, sometimes with changes to the underlying schema. I used the REFRESH TABLE command to load new files from the S3 location, and it worked fine. But when there are schema changes, either additions or deletions, the refresh does not pick them up.
Is it possible to refresh the metadata of the external table when there are schema changes? Or should I alter the table each time the schema changes? Could someone please help?
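For reference, the refresh I ran was along these lines (my_external_table stands in for the actual table name):
REFRESH TABLE my_external_table;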
01-17-2024 04:36 PM
@Dp15 - Please refer to the below illustration: [illustration not preserved]
Please refer to the below doc for additional details - https://docs.databricks.com/en/delta/delta-column-mapping.html#streaming-with-column-mapping-and-sch...
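As a rough sketch of what the linked doc describes (my_table is a placeholder name), column mapping is enabled on a Delta table with table properties like these:
ALTER TABLE my_table SET TBLPROPERTIES (
  'delta.minReaderVersion' = '2',
  'delta.minWriterVersion' = '5',
  'delta.columnMapping.mode' = 'name'
);
With column mapping in name mode, the table supports renaming and dropping columns without rewriting the underlying data files.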
01-18-2024 07:18 AM
Hi @shan_chandra, how about deletions from the external location? And what if I am not using a streaming table?
01-18-2024 08:29 AM
@Dp15 - you can drop a column manually using the below:
ALTER TABLE table_name DROP COLUMN col_name
1. Please note that dropping a column from the metadata does not delete the underlying data for that column in the files.
2. Purging the dropped column's data can be done with REORG TABLE, which rewrites the files.
3. Use VACUUM to physically delete the files that still contain the dropped column's data.
Reference:
https://docs.databricks.com/en/delta/delta-column-mapping.html#drop-columns
https://docs.databricks.com/en/delta/update-schema.html#explicitly-update-schema-to-drop-columns
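Putting those three steps together, a minimal sketch (my_table and col_name are placeholder names):
ALTER TABLE my_table DROP COLUMN col_name;  -- removes the column from the metadata only (requires column mapping)
REORG TABLE my_table APPLY (PURGE);  -- rewrites the data files without the dropped column's data
VACUUM my_table;  -- physically deletes the superseded files
Note that VACUUM honors the default retention period, so the old files are only removed once they age out of it.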
01-18-2024 01:27 PM
Hi @shan_chandra, this drop works for a Delta table that is a managed table; however, it does not work for an external table, and I am looking specifically at schema changes in external tables. A refresh might work to load new metadata into the external table, but when there are schema modifications, only additions of columns are possible; dropping a column has not worked for me. Correct me if I am wrong here.
01-19-2024 09:10 AM
@Dp15 - yes, you are correct. Dropping a column from a managed table in Databricks works differently than for an external table (since the schema is inferred from the underlying source). The hack below can help, as far as I know. Please let me know if this works for you.
1. Create or replace a new external table B with the new schema (the set of columns you want to keep) and the new data source path
2. Insert into the new table B by selecting the required columns from table A (the old table)
3. Drop table A
4. Alter table - rename table B to table A (a sketch of all four steps follows this list)
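In SQL, a minimal sketch of the four steps (table names, columns, and the S3 path are placeholders):
CREATE OR REPLACE TABLE B (col1 STRING, col2 INT)
LOCATION 's3://my-bucket/new_path/';  -- 1. new external table on the new schema and path
INSERT INTO B SELECT col1, col2 FROM A;  -- 2. copy only the columns you want to keep
DROP TABLE A;  -- 3. drop the old table (the external data files are left in place)
ALTER TABLE B RENAME TO A;  -- 4. the new table takes over the old name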
01-23-2024 07:38 AM
@shan_chandra This worked, thank you!
01-23-2024 07:48 AM - edited 01-23-2024 07:48 AM
@Dp15 - I am glad it worked. Happy to help!!!