Thank you for your response. I am familiar with that documentation, which is rather sparse, and from what I can tell the DB-write portion is only shown in Scala. What I need to know is whether the Snowflake session stays open when I start that Spark session, because we create temp tables that rely on being used within the same connection session. I'd also like to see example code for using the Databricks SQLAlchemy library to connect to Snowflake.
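For context, here is a rough sketch of the pattern I'm trying to confirm. I'm using the snowflake-sqlalchemy dialect here as a stand-in (I'm not sure the Databricks SQLAlchemy dialect can target Snowflake directly), and all of the account and credential values are placeholders:

```python
# Sketch only, not tested against our environment. The key concern: a
# Snowflake session (and therefore any TEMPORARY tables) lives and dies
# with a single connection, and SQLAlchemy's connection pool may hand
# back a different connection per execute() call unless one Connection
# is held open explicitly.

from sqlalchemy import create_engine, text
from snowflake.sqlalchemy import URL

engine = create_engine(URL(
    account="<account_identifier>",  # placeholder
    user="<user>",                   # placeholder
    password="<password>",           # placeholder
    database="<database>",
    schema="<schema>",
    warehouse="<warehouse>",
))

# Every statement run on `conn` shares one Snowflake session, so the
# temp table created here is visible to the later SELECT.
with engine.connect() as conn:
    conn.execute(text("CREATE TEMPORARY TABLE tmp_demo AS SELECT 1 AS id"))
    rows = conn.execute(text("SELECT * FROM tmp_demo")).fetchall()
    print(rows)
```

Is this single-connection behavior preserved when the reads/writes go through the Spark-Snowflake connector instead, or does each Spark operation open its own session?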