04-26-2022 08:57 AM
I am reading data from a Kafka topic, say topic_a.
I have an application, app_one, which uses Spark Structured Streaming to read data from topic_a, with a checkpoint location, loc_a, to store the checkpoint.
Now, app_one has read data up to offset 90.
Can I create a new notebook, say app_two, copy the checkpoint folder, and point app_two to this copied checkpoint folder? Will app_two then start reading from offset 90?
- Labels:
  - Kafka consumer
  - Spark
  - Structured streaming
Accepted Solutions
06-14-2022 05:21 PM
Hi @John Constantine,
It is not recommended to share a checkpoint between queries; every streaming query should have its own checkpoint. If you want another query to start at offset 90, you can specify that when starting the job by telling the query to begin processing at a custom offset.
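A minimal sketch of starting a new query at a specific offset via the Kafka source's `startingOffsets` option, which accepts a JSON string mapping topic to partition to offset. The broker address, the single partition `0`, the sink path, and the checkpoint path `loc_b` are placeholders/assumptions for illustration; adjust them for your environment.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker address
    .option("subscribe", "topic_a")
    # Start the new query at offset 90 of partition 0 (assumed single partition).
    .option("startingOffsets", '{"topic_a": {"0": 90}}')
    .load()
)

query = (
    df.writeStream
    .format("delta")                                     # example sink
    .option("checkpointLocation", "/tmp/loc_b")          # fresh checkpoint for app_two
    .start("/tmp/app_two_output")                        # placeholder output path
)
```

Note that `startingOffsets` only takes effect when the query starts with a fresh (empty) checkpoint; on restart, the query always resumes from the offsets recorded in its own checkpoint.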