We're delivering pipelines built mostly on Spark Structured Streaming on Databricks, Delta Lake, and Azure Event Hubs, and there's a requirement to integrate with AMQ/JMS endpoints (request and response queues in ActiveMQ).
Is there a proven way to integrate them as sources and sinks in Structured Streaming jobs?
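For the sink side we could hand-roll something with `foreachBatch` and the plain JMS API. A minimal sketch of what I have in mind, assuming the ActiveMQ client jar (`org.apache.activemq:activemq-client`) is attached to the cluster; the broker URL and queue name are placeholders:

```scala
import javax.jms.{DeliveryMode, Session}
import org.apache.activemq.ActiveMQConnectionFactory
import org.apache.spark.sql.{DataFrame, SparkSession}

object JmsSinkSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("jms-sink-sketch").getOrCreate()
    import spark.implicits._

    // Stand-in source; in our pipelines this would be Event Hubs or Delta.
    val stream = spark.readStream.format("rate").load()
      .selectExpr("CAST(value AS STRING) AS body")

    // Runs on executors: one JMS connection per partition per micro-batch.
    // JMS objects are not serializable, so they are created inside the task.
    val sendPartition: Iterator[String] => Unit = rows => {
      val connection =
        new ActiveMQConnectionFactory("tcp://broker:61616").createConnection()
      try {
        val session  = connection.createSession(false, Session.AUTO_ACKNOWLEDGE)
        val producer = session.createProducer(session.createQueue("RESPONSE.QUEUE"))
        producer.setDeliveryMode(DeliveryMode.PERSISTENT)
        rows.foreach(r => producer.send(session.createTextMessage(r)))
      } finally connection.close()
    }

    // foreachBatch is at-least-once: a retried micro-batch re-sends its rows.
    val writeToJms: (DataFrame, Long) => Unit = (batch, _) =>
      batch.select("body").as[String].foreachPartition(sendPartition)

    stream.writeStream.foreachBatch(writeToJms).start().awaitTermination()
  }
}
```

That would cover the response-queue half, with at-least-once semantics at best. The part we're stuck on is the source side: as far as I can tell there's no supported way to consume a JMS queue directly as a Structured Streaming source.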
I could use Kafka Connect, Flink, or custom code (Camel/Akka) to bridge those queues to Event Hubs instead, but the customer prefers to avoid extra dependencies if a tested solution exists.
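For completeness, this is roughly the bridge we'd be signing up to run: a minimal sketch using Apache Camel's `jms` and `kafka` components against the Kafka-compatible endpoint of Event Hubs (port 9093, SASL PLAIN with the connection string as password). The namespace, broker URL, queue, topic, and connection string below are all placeholders:

```scala
import org.apache.activemq.ActiveMQConnectionFactory
import org.apache.camel.builder.RouteBuilder
import org.apache.camel.component.jms.JmsComponent
import org.apache.camel.component.kafka.KafkaComponent
import org.apache.camel.impl.DefaultCamelContext

object AmqToEventHubsBridge {
  def main(args: Array[String]): Unit = {
    val context = new DefaultCamelContext()

    // JMS side: plain ActiveMQ connection factory.
    context.addComponent("jms", JmsComponent.jmsComponentAutoAcknowledge(
      new ActiveMQConnectionFactory("tcp://broker:61616")))

    // Kafka side: Event Hubs' Kafka endpoint. The username is the literal
    // string "$ConnectionString"; the password is the actual connection string.
    val kafka = new KafkaComponent()
    kafka.getConfiguration.setBrokers("mynamespace.servicebus.windows.net:9093")
    kafka.getConfiguration.setSecurityProtocol("SASL_SSL")
    kafka.getConfiguration.setSaslMechanism("PLAIN")
    kafka.getConfiguration.setSaslJaasConfig(
      """org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="<event-hubs-connection-string>";""")
    context.addComponent("kafka", kafka)

    context.addRoutes(new RouteBuilder {
      override def configure(): Unit = {
        // Request queue -> event hub; the response path would be the
        // mirror image of this route (kafka -> jms).
        from("jms:queue:REQUEST.QUEUE").to("kafka:request-hub")
      }
    })

    context.start()
    Thread.currentThread().join() // keep the bridge process alive
  }
}
```

It works on paper, but it's another service to deploy, monitor, and secure, which is exactly the extra dependency the customer wants to avoid. Hence the question: is there a battle-tested JMS source/sink for Structured Streaming, or is a bridge like this the accepted pattern?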