01-04-2023 11:55 PM
I'm working on configuring Kafka, which is installed on my local machine (laptop), and I want to connect it to my Databricks account hosted on AWS.
Secondly, I have CSV files that I want to stream in real time through Kafka into Databricks DBFS, where I'll process the data arriving live from Kafka.
Qasim Hassan
Data Engineer @Royal Cyber
Databricks UG Pakistan Lead
- Labels: AWS, Kafka, Real-time data
Accepted Solutions
01-05-2023 02:20 AM
For CSV, you just need to readStream in a notebook and append the output to CSV using the foreachBatch method.
Your Kafka broker on the laptop needs a public address, or you need to set up an AWS VPN and connect your laptop so it's in the same VPC as Databricks.
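The readStream + foreachBatch approach above could look roughly like this in a Databricks notebook (a minimal sketch, not a tested solution: the broker address, topic name, and DBFS paths are placeholder assumptions, and the laptop's broker must be reachable from the cluster per the networking note above; `spark` is the session Databricks predefines in notebooks):

```python
from pyspark.sql.functions import col

# Placeholder public address of the laptop's Kafka broker (assumption).
BROKER = "203.0.113.10:9092"

# Read the Kafka topic as a streaming DataFrame.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", BROKER)
    .option("subscribe", "csv-events")          # hypothetical topic name
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers values as bytes; cast to string, assuming one CSV line per message.
lines = raw.select(col("value").cast("string").alias("csv_line"))

def append_to_csv(batch_df, batch_id):
    # Append each micro-batch as CSV files under a DBFS path.
    (batch_df.write
        .mode("append")
        .format("csv")
        .save("dbfs:/tmp/kafka_csv_output"))     # hypothetical output path

query = (
    lines.writeStream
    .foreachBatch(append_to_csv)
    .option("checkpointLocation", "dbfs:/tmp/kafka_csv_checkpoint")
    .start()
)
```

Once the data lands on DBFS you can read it back with a normal batch `spark.read.csv(...)` for processing.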
01-05-2023 08:11 AM
Hi,
@Hubert Dudek, can you point me to some reference material or a video for help?
Qasim Hassan
Data Engineer @Royal Cyber
Databricks UG Pakistan Lead

