Hi @rak_haq
Here’s how you can read from an Azure Event Hubs Kafka endpoint in Databricks SQL, using the built-in read_kafka table-valued function and the secret() function to inject your connection string securely:
In summary, you must:
1. Point bootstrapServers at your Event Hubs Kafka endpoint (<NAMESPACE>.servicebus.windows.net:9093).
2. Tell Kafka to use SASL_SSL with PLAIN and pass the full Event Hubs connection string (including EntityPath) via kafka.sasl.jaas.config.
3. Retrieve that connection string from a Databricks Secret with secret('scope','key') and wrap it correctly in the JAAS login module config.
4. Include the required Kafka options (security.protocol, sasl.mechanism, sasl.jaas.config) in your read_kafka call.
Example SQL

```sql
SELECT
  CAST(value AS STRING) AS raw_json,
  current_timestamp() AS processing_time
FROM read_kafka(
  bootstrapServers => '<YOUR_NAMESPACE>.servicebus.windows.net:9093',
  subscribe => '<YOUR_EVENTHUB_NAME>',
  `kafka.security.protocol` => 'SASL_SSL',
  `kafka.sasl.mechanism` => 'PLAIN',
  `kafka.sasl.jaas.config` => concat(
    'org.apache.kafka.common.security.plain.PlainLoginModule required ',
    'username="$ConnectionString" ',
    'password="', secret('myScope','eventHubConnection'), '";'
  )
)
LIMIT 10;
```
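If your events are JSON, you can go one step further and parse the payload into typed columns with from_json. The schema and field names below (deviceId, temperature, ts) are hypothetical placeholders; replace them with the actual shape of your Event Hub messages:

```sql
-- Hypothetical schema: substitute the real fields of your events.
SELECT
  event.deviceId,
  event.temperature,
  event.ts
FROM (
  SELECT
    from_json(
      CAST(value AS STRING),
      'deviceId STRING, temperature DOUBLE, ts TIMESTAMP'
    ) AS event
  FROM read_kafka(
    -- same connection options as the example above
    bootstrapServers => '<YOUR_NAMESPACE>.servicebus.windows.net:9093',
    subscribe => '<YOUR_EVENTHUB_NAME>',
    `kafka.security.protocol` => 'SASL_SSL',
    `kafka.sasl.mechanism` => 'PLAIN',
    `kafka.sasl.jaas.config` => concat(
      'org.apache.kafka.common.security.plain.PlainLoginModule required ',
      'username="$ConnectionString" ',
      'password="', secret('myScope','eventHubConnection'), '";'
    )
  )
)
LIMIT 10;
```

Fields that don't match the schema come back as NULL, which makes it easy to spot malformed records while you iterate on the schema string.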