Foreign table to delta streaming table
01-23-2024 12:59 AM
I want to copy a table from a foreign catalog into a streaming table. This is the code I used, but I am getting the error: "Table table_name does not support either micro-batch or continuous scan."
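For context, the following is not the original poster's code, but a minimal sketch of the kind of read that typically raises this error, assuming a Lakehouse Federation foreign catalog backed by a JDBC source (all catalog, schema, and table names are placeholders):

```python
# All catalog/schema/table names here are placeholders.
# A batch read of a federated (JDBC-backed) foreign table works fine:
batch_df = spark.read.table("foreign_catalog.some_schema.some_table")

# But the same table is not a supported streaming source, so this fails with
# "Table ... does not support either micro-batch or continuous scan":
stream_df = spark.readStream.table("foreign_catalog.some_schema.some_table")
```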
10-29-2024 09:04 AM
Bumping this thread because I have the same question and this is still the first result on Google (c. October 2024). Many thanks to anyone who is able to assist!
11-20-2024 09:31 AM
I also want to bump this! This is my exact problem right now as well.
11-26-2024 11:24 AM - edited 11-26-2024 11:45 AM
What is the underlying type of the table you are trying to stream from? Structured Streaming does not currently support streaming reads over JDBC, so streaming from MySQL, Postgres, and similar sources is not supported.
If you are trying to perform streaming ingestion from such sources, we instead recommend using LakeFlow Connect where the source is supported.
Another alternative is to write a custom Python data source that performs these streaming reads (a sketch follows below).
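A minimal sketch of such a custom source, using the PySpark Python data source API available in recent runtimes; the class names, source name, and the toy "counter" logic are illustrative only, and a real implementation would poll your external system between offsets instead:

```python
from typing import Iterator, Tuple
from pyspark.sql.datasource import DataSource, DataSourceStreamReader, InputPartition


class RangePartition(InputPartition):
    def __init__(self, start: int, end: int):
        self.start = start
        self.end = end


class CounterStreamReader(DataSourceStreamReader):
    """Toy reader that emits a growing range of integers each micro-batch.
    A real implementation would query the external system (e.g. over JDBC)
    for rows between the start and end offsets."""

    def __init__(self):
        self.current = 0

    def initialOffset(self) -> dict:
        return {"offset": 0}

    def latestOffset(self) -> dict:
        self.current += 10          # pretend 10 new rows arrived since last batch
        return {"offset": self.current}

    def partitions(self, start: dict, end: dict):
        return [RangePartition(start["offset"], end["offset"])]

    def read(self, partition: RangePartition) -> Iterator[Tuple]:
        for i in range(partition.start, partition.end):
            yield (i, f"row-{i}")

    def commit(self, end: dict):
        pass                        # clean up any state older than `end` if needed


class CounterDataSource(DataSource):
    @classmethod
    def name(cls):
        return "counter_source"     # illustrative source name

    def schema(self):
        return "id INT, value STRING"

    def streamReader(self, schema):
        return CounterStreamReader()


# Register the source and use it as a streaming input:
spark.dataSource.register(CounterDataSource)
df = spark.readStream.format("counter_source").load()
```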
And of course, if your source can expose a change log of your data, like the binlog in MySQL or a CDC service such as AWS DMS, you can set that up and use Databricks Auto Loader for efficient incremental ingestion.
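For the DMS-style path, a minimal Auto Loader sketch, assuming the CDC service lands change files as Parquet under a cloud storage prefix (the paths, catalog, and table names below are placeholders):

```python
# Paths and table names are placeholders for your own locations.
raw_changes = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "parquet")                              # format of the landed change files
    .option("cloudFiles.schemaLocation", "/Volumes/main/ingest/_schemas/orders")
    .load("s3://my-bucket/dms-output/orders/")
)

(
    raw_changes.writeStream
    .option("checkpointLocation", "/Volumes/main/ingest/_checkpoints/orders")
    .trigger(availableNow=True)                                          # process new files incrementally, then stop
    .toTable("main.bronze.orders_changes")
)
```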

