How to restart snowflake connector?
02-20-2023 06:51 PM
After using spark.read.format("snowflake").options(**options).option("dbtable", "table_name").load() to read a table from Snowflake, if I then change the table in Snowflake and read it again, I get the first version of the table. I have worked around the problem by restarting the cluster. Is there a better way, perhaps by restarting the Snowflake connector or configuring it differently?
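For reference, here is a minimal sketch of what I am doing and the first workaround I would try, assuming the usual connector options (sfUrl, sfUser, and so on; all values below are placeholders, not my real configuration): build the DataFrame fresh on each read and clear Spark's in-memory cache beforehand.

```python
# Placeholder Snowflake connector options (not real credentials).
options = {
    "sfUrl": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>",
}

def read_snowflake_table(spark, table_name):
    # Build a fresh DataFrame on every call so nothing is served from a
    # DataFrame variable created earlier in the notebook session.
    return (
        spark.read.format("snowflake")
        .options(**options)
        .option("dbtable", table_name)
        .load()
    )

# If stale results still come back, clearing Spark's in-memory cache before
# re-reading may help (note: this clears ALL cached tables in the session).
spark.catalog.clearCache()
df = read_snowflake_table(spark, "table_name")
df.show()
```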
02-24-2023 04:51 PM
Yes, that would work. However, it is a longish Snowflake query producing a number of tables that are all called by the Databricks notebook, so it requires quite a few changes. I'll use this alternative if I automate the process.
That said, I think this is a serious issue that deserves a warning from Databricks when using the Snowflake connector. One implicitly trusts that the connection will return current data, and there is no reason programmers would limit their Snowflake changes to the particular ongoing connection.
In any case, I imagine that under the hood a connection engine has been created that could be closed and reopened. Perhaps that engine could be reached with standard Snowflake SQLAlchemy commands from the notebook?
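For example, something along these lines might work, assuming the snowflake-sqlalchemy package is installed on the cluster. Note that this is only a sketch: it opens a separate connection with placeholder credentials rather than reaching whatever engine the Spark connector manages internally, which as far as I know is not exposed to the notebook.

```python
from sqlalchemy import create_engine, text
from snowflake.sqlalchemy import URL

# Placeholder credentials; replace with your own connection details.
engine = create_engine(URL(
    account="<account>",
    user="<user>",
    password="<password>",
    database="<database>",
    schema="<schema>",
    warehouse="<warehouse>",
))

with engine.connect() as conn:
    # A direct query always hits Snowflake, so it reflects the current state
    # of the table regardless of what the Spark session has cached.
    row_count = conn.execute(text("SELECT COUNT(*) FROM table_name")).scalar()
    print(row_count)

# Disposing of the engine explicitly closes its pooled connections.
engine.dispose()
```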

