<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Writing to Snowflake from Databricks - sqlalchemy replacement? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/writing-to-snowflake-from-databricks-sqlalchemy-replacement/m-p/64320#M32536</link>
<description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/102604"&gt;@brian999&lt;/a&gt; - The spark-snowflake connector is built into the Databricks Runtime (DBR). Please refer to the article below for examples.&lt;/P&gt;
&lt;P&gt;&lt;A href="https://docs.databricks.com/en/connect/external-systems/snowflake.html#read-and-write-data-from-snowflake" target="_blank"&gt;https://docs.databricks.com/en/connect/external-systems/snowflake.html#read-and-write-data-from-snowflake&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;Please let us know if this helps.&lt;/P&gt;</description>
    <pubDate>Thu, 21 Mar 2024 18:10:06 GMT</pubDate>
    <dc:creator>shan_chandra</dc:creator>
    <dc:date>2024-03-21T18:10:06Z</dc:date>
    <item>
      <title>Writing to Snowflake from Databricks - sqlalchemy replacement?</title>
      <link>https://community.databricks.com/t5/data-engineering/writing-to-snowflake-from-databricks-sqlalchemy-replacement/m-p/64315#M32531</link>
      <description>&lt;P&gt;I am trying to migrate some complex Python load processes into Databricks. Our load processes currently use pandas, and we're hoping to refactor into Spark soon. For now, I need to figure out how to alter our functions that create SQLAlchemy connection engines so I can bring our libraries that use SQLAlchemy over to Databricks. I see that there is a databricks-sqlalchemy library, but there also seems to be a fairly strong option for connecting to Snowflake using a Spark session and a JDBC (I think?) connector. Do these Spark JDBC sessions work in a similar way to SQLAlchemy connection sessions?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Has anyone done this before? What is the well-traveled path for this kind of migration?&lt;/P&gt;</description>
      <pubDate>Thu, 21 Mar 2024 17:31:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/writing-to-snowflake-from-databricks-sqlalchemy-replacement/m-p/64315#M32531</guid>
      <dc:creator>brian999</dc:creator>
      <dc:date>2024-03-21T17:31:01Z</dc:date>
    </item>
    <item>
      <title>Re: Writing to Snowflake from Databricks - sqlalchemy replacement?</title>
      <link>https://community.databricks.com/t5/data-engineering/writing-to-snowflake-from-databricks-sqlalchemy-replacement/m-p/64320#M32536</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/102604"&gt;@brian999&lt;/a&gt; - The spark-snowflake connector is built into the Databricks Runtime (DBR). Please refer to the article below for examples.&lt;/P&gt;
&lt;P&gt;&lt;A href="https://docs.databricks.com/en/connect/external-systems/snowflake.html#read-and-write-data-from-snowflake" target="_blank"&gt;https://docs.databricks.com/en/connect/external-systems/snowflake.html#read-and-write-data-from-snowflake&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;Please let us know if this helps.&lt;/P&gt;</description>
      <pubDate>Thu, 21 Mar 2024 18:10:06 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/writing-to-snowflake-from-databricks-sqlalchemy-replacement/m-p/64320#M32536</guid>
      <dc:creator>shan_chandra</dc:creator>
      <dc:date>2024-03-21T18:10:06Z</dc:date>
    </item>
    <item>
      <title>Re: Writing to Snowflake from Databricks - sqlalchemy replacement?</title>
      <link>https://community.databricks.com/t5/data-engineering/writing-to-snowflake-from-databricks-sqlalchemy-replacement/m-p/64324#M32538</link>
      <description>&lt;P&gt;Thank you for your response. I am familiar with this documentation, which is rather sparse; the DB write portion appears to be Scala-only, from what I can tell. I need to know whether the session with Snowflake stays open when I start that Spark session, because we create temp tables that rely on being used within the same connection session.&amp;nbsp;I'd also like to see any example code of using the databricks-sqlalchemy library to connect to Snowflake.&lt;/P&gt;</description>
      <pubDate>Thu, 21 Mar 2024 18:16:50 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/writing-to-snowflake-from-databricks-sqlalchemy-replacement/m-p/64324#M32538</guid>
      <dc:creator>brian999</dc:creator>
      <dc:date>2024-03-21T18:16:50Z</dc:date>
    </item>
    <item>
      <title>Re: Writing to Snowflake from Databricks - sqlalchemy replacement?</title>
      <link>https://community.databricks.com/t5/data-engineering/writing-to-snowflake-from-databricks-sqlalchemy-replacement/m-p/64698#M32634</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/102604"&gt;@brian999&lt;/a&gt; - Below is an example of using the spark-snowflake connector with Python.&lt;/P&gt;
&lt;P&gt;&lt;A href="https://docs.databricks.com/_extras/notebooks/source/snowflake-python.html" target="_blank"&gt;https://docs.databricks.com/_extras/notebooks/source/snowflake-python.html&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 26 Mar 2024 18:33:26 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/writing-to-snowflake-from-databricks-sqlalchemy-replacement/m-p/64698#M32634</guid>
      <dc:creator>shan_chandra</dc:creator>
      <dc:date>2024-03-26T18:33:26Z</dc:date>
    </item>
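For readers landing on this thread, the Python pattern from the linked Databricks notebook can be sketched roughly as follows. The spark-snowflake connector ships with the Databricks Runtime, so no extra install is needed on a cluster; every account name and credential below is a placeholder, not a working default, and in practice the password should come from a Databricks secret scope rather than a literal. One caveat relevant to the temp-table question above: each Spark read or write action opens its own connection to Snowflake, so session-scoped temporary tables generally cannot be relied on to survive across separate DataFrame operations.

```python
# Sketch of reading and writing Snowflake tables via the spark-snowflake
# connector bundled with the Databricks Runtime. Runs inside a Databricks
# notebook or any environment with an active SparkSession.
# All values in SF_OPTIONS are placeholders, not real defaults.

SF_OPTIONS = {
    "sfUrl": "myaccount.snowflakecomputing.com",  # placeholder account URL
    "sfUser": "MY_USER",                          # placeholder user
    "sfPassword": "MY_PASSWORD",                  # placeholder; prefer a secret scope
    "sfDatabase": "MY_DB",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "MY_WH",
}


def read_snowflake_table(spark, table):
    """Load one Snowflake table into a Spark DataFrame."""
    return (
        spark.read.format("snowflake")
        .options(**SF_OPTIONS)
        .option("dbtable", table)
        .load()
    )


def write_df_to_snowflake(df, table):
    """Append a Spark DataFrame to a Snowflake table.

    Note: this write opens its own connection, so it will not see
    temp tables created in a different session.
    """
    (
        df.write.format("snowflake")
        .options(**SF_OPTIONS)
        .option("dbtable", table)
        .mode("append")
        .save()
    )
```

Usage on a cluster would look like `df = read_snowflake_table(spark, "ORDERS")` followed by `write_df_to_snowflake(df, "ORDERS_COPY")`; the `spark` session object is the one Databricks provides in every notebook.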
  </channel>
</rss>

