07-08-2024 01:29 PM
Hello,
Wanted to know if we can write the stream output to a Kafka topic in a DLT pipeline?
Please let me know.
Thank you.
07-08-2024 04:03 PM
Hi,
Yes, you can write the stream output to a Kafka topic in a Databricks Delta Live Tables (DLT) pipeline. Here's a basic example in PySpark:
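A minimal sketch of such a write, assuming `df` is a streaming DataFrame produced earlier in the pipeline; the broker address, topic name, and checkpoint path are placeholders you would replace with your own:

```python
from pyspark.sql.functions import to_json, struct

# Kafka expects a string/binary "value" column, so serialize each row to JSON.
kafka_df = df.select(to_json(struct("*")).alias("value"))

# Start the streaming write to Kafka (placeholder broker/topic/checkpoint).
(kafka_df.writeStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("topic", "my_topic")
    .option("checkpointLocation", "/tmp/checkpoints/kafka_sink")
    .start())
```

This only runs inside a Databricks/Spark runtime with the Kafka connector available; it is a pipeline fragment, not a standalone script.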
09-18-2024 09:17 AM
Is it possible to have 2 notebooks in a DLT pipeline, with the first notebook reading from topic1 in Kafka and writing to a DLT table, and the second notebook reading from this DLT table, applying some data transformations, and stream-writing to topic2 in Kafka? All in streaming mode?
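A rough sketch of the two reading stages the question describes, with placeholder broker and topic names; each function could live in its own notebook attached to the same pipeline. Whether the final write to topic2 can happen inside the DLT pipeline itself depends on your runtime; on older runtimes the Kafka write typically runs as a separate downstream streaming job:

```python
import dlt
from pyspark.sql.functions import col

@dlt.table
def bronze_from_topic1():
    # Notebook 1: stream from Kafka topic1 into a DLT table.
    return (spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker1:9092")
        .option("subscribe", "topic1")
        .load())

@dlt.table
def silver_transformed():
    # Notebook 2: read the first table as a stream and transform it.
    return (dlt.read_stream("bronze_from_topic1")
        .select(col("value").cast("string").alias("value")))
```

This is a DLT pipeline fragment and only executes inside a Delta Live Tables run, where `dlt` and `spark` are provided by the runtime.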
07-09-2024 09:28 AM - edited 07-09-2024 09:42 AM
Any sample code snippet for connecting with ScramLoginModule?
I'm using the below code to push data to a Kafka topic and getting an error saying
07-10-2024 04:04 AM - edited 07-10-2024 04:19 AM
Hi!
Ensure your code is set up to use the Kafka connector libraries. Here is the complete example:
# `df` is your source streaming DataFrame; replace the broker details,
# credentials, topic, and checkpoint path with your own values.
df1 = df.selectExpr("CAST(null AS STRING) as key", "to_json(struct(*)) AS value")
df1.writeStream \
    .format("kafka") \
    .option("kafka.bootstrap.servers", "your_broker_details") \
    .option("kafka.security.protocol", "SASL_SSL") \
    .option("kafka.sasl.mechanism", "SCRAM-SHA-512") \
    .option("kafka.sasl.jaas.config", "org.apache.kafka.common.security.scram.ScramLoginModule required username='your_username' password='your_password';") \
    .option("topic", "your_topic1") \
    .option("checkpointLocation", "/path/to/your/checkpoint") \
    .option("kafka.metadata.max.age.ms", "120000") \
    .start()
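Rather than hard-coding the username and password into the JAAS string, you can assemble the SASL options from variables (for example, values fetched from a Databricks secret scope). A minimal sketch in plain Python, with hypothetical values:

```python
def scram_jaas_config(username: str, password: str) -> str:
    # Build the JAAS config string expected by kafka.sasl.jaas.config
    # for SASL/SCRAM authentication.
    return (
        "org.apache.kafka.common.security.scram.ScramLoginModule required "
        f"username='{username}' password='{password}';"
    )

# Hypothetical credentials; in practice pull these from a secret store.
sasl_options = {
    "kafka.security.protocol": "SASL_SSL",
    "kafka.sasl.mechanism": "SCRAM-SHA-512",
    "kafka.sasl.jaas.config": scram_jaas_config("your_username", "your_password"),
}
```

The resulting dictionary can then be passed to the writer with a loop over `.option(...)` calls, keeping credentials out of the notebook source.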
Mehdi TAJMOUATI