<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Foreign table to delta streaming table in Get Started Discussions</title>
    <link>https://community.databricks.com/t5/get-started-discussions/foreign-table-to-delta-streaming-table/m-p/99545#M8792</link>
    <description>&lt;P&gt;I also want to bump this! This is my exact problem right now as well.&lt;/P&gt;</description>
    <pubDate>Wed, 20 Nov 2024 17:31:36 GMT</pubDate>
    <dc:creator>sbiales</dc:creator>
    <dc:date>2024-11-20T17:31:36Z</dc:date>
    <item>
      <title>Foreign table to delta streaming table</title>
      <link>https://community.databricks.com/t5/get-started-discussions/foreign-table-to-delta-streaming-table/m-p/58251#M8789</link>
      <description>&lt;P&gt;I want to copy a table from a foreign catalog as my streaming table. This is the code I used, but I am getting an error: &lt;SPAN&gt;Table table_name does not support either micro-batch or continuous scan.&lt;/SPAN&gt;&lt;/P&gt;&lt;PRE&gt;spark.readStream
    .table(table_name)
    .writeStream
    .trigger(availableNow=True)
    .option("checkpointLocation", "dbfs:/folder_path")
    .toTable(new_table)&lt;/PRE&gt;</description>
      <pubDate>Tue, 23 Jan 2024 08:59:39 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/foreign-table-to-delta-streaming-table/m-p/58251#M8789</guid>
      <dc:creator>ksenija</dc:creator>
      <dc:date>2024-01-23T08:59:39Z</dc:date>
    </item>
    <item>
      <title>Re: Foreign table to delta streaming table</title>
      <link>https://community.databricks.com/t5/get-started-discussions/foreign-table-to-delta-streaming-table/m-p/96677#M8791</link>
      <description>&lt;P&gt;Bumping this thread because I have the same question and this is still the first result on Google (c. October 2024). Many thanks to anyone who is able to assist!&lt;/P&gt;</description>
      <pubDate>Tue, 29 Oct 2024 16:04:36 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/foreign-table-to-delta-streaming-table/m-p/96677#M8791</guid>
      <dc:creator>MLE123</dc:creator>
      <dc:date>2024-10-29T16:04:36Z</dc:date>
    </item>
    <item>
      <title>Re: Foreign table to delta streaming table</title>
      <link>https://community.databricks.com/t5/get-started-discussions/foreign-table-to-delta-streaming-table/m-p/99545#M8792</link>
      <description>&lt;P&gt;I also want to bump this! This is my exact problem right now as well.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Nov 2024 17:31:36 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/foreign-table-to-delta-streaming-table/m-p/99545#M8792</guid>
      <dc:creator>sbiales</dc:creator>
      <dc:date>2024-11-20T17:31:36Z</dc:date>
    </item>
    <item>
      <title>Re: Foreign table to delta streaming table</title>
      <link>https://community.databricks.com/t5/get-started-discussions/foreign-table-to-delta-streaming-table/m-p/100154#M8793</link>
      <description>&lt;P&gt;What is the underlying type of the table you are trying to stream from? Structured Streaming does not currently support streaming reads over JDBC, so streaming from MySQL, Postgres, and similar databases is not supported.&lt;/P&gt;
&lt;P data-unlink="true"&gt;If you are trying to perform stream ingestion from such sources, we instead recommend using &lt;A href="https://docs.databricks.com/en/ingestion/lakeflow-connect/index.html#database-connector-components" target="_self"&gt;LakeFlow Connect&lt;/A&gt; for supported sources.&lt;/P&gt;
&lt;P&gt;Another alternative is to write a &lt;A href="https://spark.apache.org/docs/preview/api/python/user_guide/sql/python_data_source.html#python-data-source-api" target="_self"&gt;Python data source&lt;/A&gt; that performs these streaming reads.&lt;/P&gt;
&lt;P&gt;And of course, if your source can expose a change log of your data, such as the MySQL binlog, or via a service like AWS DMS, you can land those changes in cloud storage and use Databricks Auto Loader for efficient incremental ingestion.&lt;/P&gt;</description>
      <pubDate>Tue, 26 Nov 2024 19:45:07 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/foreign-table-to-delta-streaming-table/m-p/100154#M8793</guid>
      <dc:creator>cgrant</dc:creator>
      <dc:date>2024-11-26T19:45:07Z</dc:date>
    </item>
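The options in the last reply (a custom Python data source, or change-log capture plus Auto Loader) all boil down to the same incremental-read pattern: remember a high-water mark and fetch only rows beyond it on each poll. A minimal pure-Python sketch of that pattern, using the stdlib sqlite3 module as a stand-in for the JDBC source (the `events` table and `id`/`payload` columns are hypothetical, not from the thread):

```python
# High-water-mark polling sketch. sqlite3 stands in for a JDBC source;
# a real pipeline would persist the watermark (e.g. in a checkpoint).
import sqlite3

def fetch_new_rows(conn, last_seen_id):
    """Return rows whose id is greater than the stored high-water mark."""
    cur = conn.execute(
        "SELECT id, payload FROM events WHERE id > ? ORDER BY id",
        (last_seen_id,),
    )
    return cur.fetchall()

# Demo: two polling cycles against an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)", [(1, "a"), (2, "b")])

watermark = 0
batch = fetch_new_rows(conn, watermark)        # first micro-batch: rows 1 and 2
watermark = max(row[0] for row in batch)       # advance the high-water mark

conn.execute("INSERT INTO events VALUES (3, 'c')")
batch = fetch_new_rows(conn, watermark)        # next micro-batch: only the new row
```

A `DataSourceStreamReader` in the Python Data Source API plays the same role: its offsets are the watermark, and `read` fetches only the rows between the start and end offsets of each micro-batch.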
  </channel>
</rss>

