Hi,
I have a Spark Streaming job that reads from Kafka, processes the data, and writes to Delta Lake.
Number of Kafka partitions: 100
Number of executors: 2 (4 cores each)
So we have 8 cores in total reading from the 100 partitions of the topic. I wanted to understand: does Spark internally spin up multiple threads to read from multiple partitions in parallel? If not, is there any way to spin up multiple threads for the Kafka consumer?
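For context, here is a quick sanity check of the parallelism in my setup, assuming the default behaviour of the Spark Kafka source where each Kafka topic partition maps to one Spark task per micro-batch (the numbers for executors and cores are the ones from my job above):

```python
import math

# Assumption: the Spark Kafka source creates one task per Kafka topic
# partition by default, so a micro-batch over 100 partitions yields
# 100 tasks. With 2 executors x 4 cores there are 8 task slots, and
# tasks run 8 at a time in successive "waves".
kafka_partitions = 100
executors = 2
cores_per_executor = 4

task_slots = executors * cores_per_executor        # concurrent tasks = 8
waves = math.ceil(kafka_partitions / task_slots)   # waves per micro-batch = 13

print(task_slots, waves)
```

So as I understand it, at any moment only 8 partitions are being consumed, and the rest wait their turn within the same micro-batch, which is why I am asking about extra consumer threads.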