Yes, that would work. However, it is a longish Snowflake query producing a number of tables that are all then read by the Databricks notebook, so it would require quite a few changes. I'll use this alternative if I automate the process.
That said, I think this is a serious issue that deserves a warning from Databricks when using the Snowflake connector. One implicitly trusts that the connection will keep working, and there is no reason to expect that programmers will limit their Snowflake changes to the session of the ongoing connection.
In any case, I imagine that under the hood a connection engine has been created that could be closed and reopened. Maybe one could access that engine, or simply create a fresh one, with standard snowflake-sqlalchemy commands from the notebook?
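Something along these lines is what I have in mind. This is only a minimal sketch, assuming the snowflake-sqlalchemy package is installed on the cluster and that credentials are available (the account/user/warehouse names below are placeholders); it opens an independent connection from the notebook rather than touching whatever connection the Spark Snowflake connector manages internally, and it can be disposed and recreated at will:

```python
from sqlalchemy import create_engine, text
from snowflake.sqlalchemy import URL

# Hypothetical connection parameters -- replace with your own account details,
# ideally pulled from Databricks secrets rather than hard-coded.
engine = create_engine(URL(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="my_warehouse",
    database="my_database",
    schema="my_schema",
    role="my_role",
))

# Run a statement over a fresh connection; the context manager closes it afterwards.
with engine.connect() as conn:
    version = conn.execute(text("select current_version()")).scalar()
    print(version)

# Dispose of the engine's connection pool; a new engine can be created later
# if the session needs to be "reopened" after changes on the Snowflake side.
engine.dispose()
```

If that works, the reconnection logic could be wrapped in a small helper that disposes the engine and rebuilds it whenever a query fails, which would at least make the behaviour explicit instead of relying on the connector's internal session.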