Structured Streaming using Delta as source and Delta as sink, where the Delta tables are under Unity Catalog
05-01-2024 08:49 AM
Hello Everyone,
Here is my use case.
1. My source table (bronze Delta table) is under Unity Catalog and is a transactional (insert/update) table.
2. My target table (silver Delta table) is also under Unity Catalog.
3. On a daily basis, I need to ingest the incremental data from the bronze table into the silver table under Unity Catalog.
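For reference, the intended incremental flow above can be sketched with PySpark Structured Streaming roughly as follows. All catalog, schema, and table names and the checkpoint path below are placeholders, not taken from this thread, and `skipChangeCommits` assumes a recent DBR (12.1+); on older runtimes `ignoreChanges` is the closest equivalent:

```python
# Sketch of the bronze -> silver incremental flow described above.
# Names and paths are placeholders; adjust to your Unity Catalog layout.

def fq_name(catalog: str, schema: str, table: str) -> str:
    """Build the three-level name that Unity Catalog tables require."""
    return f"{catalog}.{schema}.{table}"

def run_stream(spark, catalog="main", checkpoint="/tmp/checkpoints/silver"):
    # Read the bronze Delta table as a stream. Because the bronze table
    # receives updates as well as inserts, skipChangeCommits is set so the
    # stream does not fail when it encounters non-append commits.
    bronze = (
        spark.readStream
        .option("skipChangeCommits", "true")
        .table(fq_name(catalog, "bronze", "transactions"))
    )
    # Write into the silver Delta table. trigger(availableNow=True)
    # processes all pending data and stops, which suits a daily batch-style
    # incremental job driven by a scheduler.
    return (
        bronze.writeStream
        .option("checkpointLocation", checkpoint)
        .trigger(availableNow=True)
        .toTable(fq_name(catalog, "silver", "transactions"))
    )
```

Note that shared-access-mode clusters gained Structured Streaming support for Unity Catalog tables only in newer DBR releases, so the runtime version may matter as much as the access-control setting here.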
I get the following error when reading the source/bronze table under Unity Catalog. I am using a shared cluster with DBR 11.3 LTS.
Method public org.apache.spark.sql.Dataset org.apache.spark.sql.streaming.DataStreamReader.table(java.lang.String) is not whitelisted on class class org.apache.spark.sql.streaming.DataStreamReader
Appreciate any pointers to address and resolve this error.
Kind Regards,
Karthik
05-01-2024 01:05 PM
I came across this article: "readStream() is not whitelisted error when running a query" on the Databricks knowledge base.
It states the solution as: "You should use a cluster that does not have table access control enabled for streaming queries."
However, both the source and target tables are under Unity Catalog. Does this mean we cannot implement Structured Streaming with a Delta table as the streaming source?
Appreciate your input and time.
Kind Regards,
Karthik

