Ingest Data into Databricks with Kafka
09-04-2023 07:45 AM
I am trying to ingest data into Databricks with Kafka. I have Kafka installed on a virtual machine, where the data I need is already stored as JSON in a Kafka topic. In Databricks, I have the following code:
```
df = (spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "<VM_IP:9092>")
    .option("subscribe", "<topicName>")
    .load())
```
Where the printed schema gives me:
```
 |-- key: binary (nullable = true)
 |-- value: binary (nullable = true)
 |-- topic: string (nullable = true)
 |-- partition: integer (nullable = true)
 |-- offset: long (nullable = true)
 |-- timestamp: timestamp (nullable = true)
 |-- timestampType: integer (nullable = true)
```
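Since the payload is JSON but arrives in the binary `value` column, it will eventually need decoding; a minimal sketch, assuming a hypothetical two-field payload schema (replace it with the real message structure):
```
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType

# Hypothetical payload schema -- replace with the actual structure of the JSON messages
payload_schema = StructType([
    StructField("id", StringType()),
    StructField("event", StringType()),
])

# Cast the binary Kafka value to a string, then parse it as JSON
parsed_df = (df
    .select(col("value").cast("string").alias("json"))
    .select(from_json("json", payload_schema).alias("data"))
    .select("data.*"))
```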
Then I try to write the data to a Delta table, but that code only outputs 'Stream Initializing' and gets stuck there.
I would appreciate some help, because I cannot figure out what I am doing wrong or missing here.
09-04-2023 08:18 AM - edited 09-04-2023 08:20 AM
Hi @Retired_mod, thanks for the answer. But I do have a checkpoint location set when writing. This is the code:
```
delta_table_path = "/mnt/delta-table-path"

df.writeStream \
    .format("delta") \
    .outputMode("append") \
    .option("checkpointLocation", "/mnt/checkpoint-location") \
    .start(delta_table_path)
```
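As a side note, a bounded test run can help distinguish "no data arriving" from "stuck": a sketch using the availableNow trigger (Spark 3.3+) to process whatever is already in the topic and then stop:
```
query = (df.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/mnt/checkpoint-location")
    .trigger(availableNow=True)  # process all currently available offsets, then stop
    .start(delta_table_path))

query.awaitTermination()  # returns once the bounded run finishes
```
Note also that for streaming queries the Kafka source defaults to startingOffsets = "latest", so messages already sitting in the topic are skipped unless "earliest" is set on the read.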
09-04-2023 08:34 AM
What about using the Hive metastore in Databricks? Maybe that's the issue, but I also tried building this pipeline in order to process only one message, and it still got stuck at stream initializing.
09-06-2023 10:16 PM
Could we try display(df) after the readStream to see whether we are able to read data from Kafka? This will help us rule out Kafka read issues.
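For instance, a one-off batch read with the same options (a sketch reusing the placeholder broker and topic from above) isolates connectivity from streaming mechanics:
```
# Bounded batch read against the same topic -- no checkpoint or trigger involved
batch_df = (spark.read
    .format("kafka")
    .option("kafka.bootstrap.servers", "<VM_IP:9092>")
    .option("subscribe", "<topicName>")
    .option("startingOffsets", "earliest")
    .option("endingOffsets", "latest")
    .load())

display(batch_df)  # in a Databricks notebook; batch_df.show() works anywhere
```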
09-07-2023 06:38 AM - edited 09-07-2023 06:39 AM
I also get stuck with this...
Could it be a cluster memory problem, or a network issue with the connection to the virtual machine?
09-08-2023 03:41 PM
You need to check the driver's logs while the stream is initializing. Look at the log4j output for the driver logs; if there is an issue connecting to your Kafka broker, you will see it there.
09-11-2023 02:21 AM
Yeah, in fact, when checking the log4j logs I see the following:
```
23/09/11 09:11:27 WARN NetworkClient: [Consumer clientId=consumer-spark-kafka-source-3e9266a6-081d-4946-b41e-38873d2b01c0--1036396469-driver-0-1, groupId=spark-kafka-source-3e9266a6-081d-4946-b41e-38873d2b01c0--1036396469-driver-0] Bootstrap broker VM_IP (id: -1 rack: null) disconnected
```
I added listeners=PLAINTEXT://VM_IP:9092 to the Kafka config (a solution I found when searching for the issue), but I am still having trouble connecting to the VM.
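For reference, listeners only controls which interfaces the broker binds to; clients also need advertised.listeners to return an address reachable from outside the VM. A sketch of the relevant server.properties lines, with a placeholder external IP:
```
# server.properties on the Kafka VM (a sketch; assumes a single PLAINTEXT listener)
listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://<VM_EXTERNAL_IP>:9092
```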
09-11-2023 03:19 AM
Update: after changing the IP address to the external IP of the machine, I get:
```
23/09/11 10:14:47 INFO AppInfoParser: Kafka version: 7.4.0-ccs
23/09/11 10:14:47 INFO AppInfoParser: Kafka commitId: 30969fa33c185e88
23/09/11 10:14:47 INFO AppInfoParser: Kafka startTimeMs: 1694427287346
23/09/11 10:14:47 INFO KafkaConsumer: [Consumer clientId=consumer-spark-kafka-source-51917966-dd8d-4b6b-9532-6076a916ea5b-998856815-driver-0-1, groupId=spark-kafka-source-51917966-dd8d-4b6b-9532-6076a916ea5b-998856815-driver-0] Subscribed to topic(s): <topicName>
```
But soon after it closes the connection again...
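If the bootstrap connection now succeeds but drops right after subscribing, the broker metadata is often advertising an address the client cannot reach (see the advertised.listeners note above). A quick reachability check from a notebook cell, assuming nc is available on the cluster nodes:
```
%sh
nc -vz <VM_EXTERNAL_IP> 9092
```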

