Greetings.
We currently have a Spark Structured Streaming job (Scala) retrieving Avro data from an Azure Event Hub, with a Confluent Schema Registry endpoint exposed through an Azure API Management gateway that uses certificate authentication.
Until now, the .jks files used by the Databricks consumer were accessed by mounting the storage account into the Databricks workspace and configuring the from_avro() options as follows:
val fromAvroOptions = new java.util.HashMap[String, String]()
fromAvroOptions.put("mode", "PERMISSIVE")
fromAvroOptions.put("confluent.schema.registry.ssl.truststore.location", "/dbfs/mnt/keystores/Client_Cert.truststore.jks")
fromAvroOptions.put("confluent.schema.registry.ssl.truststore.password", truststorePass)
fromAvroOptions.put("confluent.schema.registry.ssl.keystore.location", "/dbfs/mnt/keystores/Client_Cert.keystore.jks")
fromAvroOptions.put("confluent.schema.registry.ssl.keystore.password", keystorePass)
fromAvroOptions.put("confluent.schema.registry.ssl.key.password", keyPass)
We decided to migrate the storage account to Unity Catalog external volumes in order to access the .jks files (ref), which is supposed to work.
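Concretely, after the migration only the truststore/keystore paths change, now pointing at the external volume (the catalog, schema, and volume names below are placeholders):
// Same options as before, but the TLS files are now read from a Unity Catalog volume path
// (<catalog>, <schema>, and keystores are placeholder names).
fromAvroOptions.put("confluent.schema.registry.ssl.truststore.location", "/Volumes/<catalog>/<schema>/keystores/Client_Cert.truststore.jks")
fromAvroOptions.put("confluent.schema.registry.ssl.keystore.location", "/Volumes/<catalog>/<schema>/keystores/Client_Cert.keystore.jks")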
The initial handshake and authentication are completed, and a successful request is recorded in the APIM logs:
However, while trying to display the data, the following error occurs:
The compute configuration we used is the following:
Unity Catalog-enabled single user access mode cluster (single node, 14.3 LTS)
+ the com.microsoft.azure:azure-eventhubs-spark_2.12:2.3.22 library
Full privileges have also been granted to the user at the catalog, schema, and volume levels.
When attempting to read the data with a Kafka consumer, no exception is thrown, but the same error appears in the log4j output and no messages can be decoded:
Using a Shared cluster with 15.4 LTS seems to yield similar errors.
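For reference, the Kafka-based read mentioned above was wired up roughly as in the sketch below (the namespace, topic, and secret scope/key names are placeholders, and the kafkashaded JAAS module name is the Databricks-specific one):
// Reading the same Event Hub through its Kafka-compatible endpoint
// (namespace, topic, and secret scope/key names are placeholders).
val bootstrapServers = "<eventhubs-namespace>.servicebus.windows.net:9093"
val connectionString = dbutils.secrets.get("<scope>", "<eventhubs-connection-string-key>")
val jaasConfig =
  "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required " +
  "username=\"$ConnectionString\" password=\"" + connectionString + "\";"

val rawDf = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", bootstrapServers)
  .option("subscribe", "<eventhub-name>")
  .option("kafka.security.protocol", "SASL_SSL")
  .option("kafka.sasl.mechanism", "PLAIN")
  .option("kafka.sasl.jaas.config", jaasConfig)
  .load()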
Any help would be appreciated. Thanks in advance.