gudurusreddy99
New Contributor II
since 10-23-2025
a month ago

User Stats

  • 4 Posts
  • 0 Solutions
  • 0 Kudos given
  • 0 Kudos received

User Activity

Requirement: I have a Kafka streaming pipeline that ingests Pixels data. For each incoming record, I need to validate the Pixels key against an existing Delta table (pixel_tracking_data), which contains over 2 billion records accumulated over the past ...
Databricks DLT Joins: A streaming table join with a Delta table is reading 2 billion records from the Delta table for each and every micro-batch. How can this be avoided so that the full 2 billion records are not read for every micro-batch? Your suggestions and feedback w...
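
For context, below is a minimal, hypothetical DLT (Python) sketch of the stream-static join pattern these posts describe: a Kafka-fed streaming table joined against the large pixel_tracking_data Delta table. The Kafka broker, topic, and column names are assumptions for illustration only; the sketch simply shows the shape of the pipeline in question, where the static Delta side is resolved again for each micro-batch.

```python
import dlt

# Hypothetical raw source: Pixels events ingested from Kafka.
# Broker address, topic, and column names are placeholders, not from the post.
@dlt.table(name="pixels_raw")
def pixels_raw():
    return (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "<broker-host>:9092")
        .option("subscribe", "pixels_events")
        .load()
        .selectExpr(
            "CAST(key AS STRING) AS pixel_key",
            "CAST(value AS STRING) AS payload",
        )
    )

# Stream-static join: the streaming side is the Kafka feed, the static side is
# the ~2B-row Delta table. The static snapshot is evaluated per micro-batch,
# which is the per-micro-batch full read reported in the post (unless the join
# key allows effective file/partition pruning on pixel_tracking_data).
@dlt.table(name="pixels_validated")
def pixels_validated():
    stream_df = dlt.read_stream("pixels_raw")
    lookup_df = spark.read.table("pixel_tracking_data").select("pixel_key")
    return stream_df.join(lookup_df, on="pixel_key", how="inner")
```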