3 weeks ago - last edited 3 weeks ago
We have a pipeline in a job that dynamically creates a set of streaming tables based on a list of Kafka topics, like this:
```python
# inside a loop over topic names
def topic_flow(topic_name=topic_name):
    return read_from_kafka(important_parameters)
```
But it seems that when some metadata inside `markdown_info` changes, the underlying table is not updated when the pipeline restarts. Is this known / expected behaviour? Is there any way to update the comment without having to recreate the table?
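For context on the loop pattern above: the `topic_name=topic_name` default argument is the standard way to bind the current loop value into each generated function, since closures defined in a loop would otherwise all see the final loop value. Here is a self-contained, Kafka-free sketch of that mechanism (the function body and topic names are illustrative placeholders, not our actual pipeline code):

```python
# Minimal sketch of defining one function per item inside a loop.
# The `topic_name=topic_name` default freezes the current loop value;
# without it, every closure would resolve topic_name to the last value.
def make_flows(topic_names):
    flows = {}
    for topic_name in topic_names:
        def topic_flow(topic_name=topic_name):
            # stand-in for read_from_kafka(important_parameters)
            return f"reading {topic_name}"
        flows[topic_name] = topic_flow
    return flows

flows = make_flows(["orders", "payments"])
print(flows["orders"]())    # reading orders
print(flows["payments"]())  # reading payments
```

Each entry in `flows` keeps its own topic, which is why the pipeline can register a distinct streaming table per Kafka topic from one loop.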