
Delta Live Tables error with Kafka SSL

mwoods
New Contributor III

We have a Spark Structured Streaming job that consumes data from a Kafka topic and writes out to Delta tables in Unity Catalog.

We are looking to refactor it to use Delta Live Tables, but it appears that, at present, a DLT pipeline cannot both access Unity Catalog and authenticate with Kafka over SSL.

Our code for establishing the readStream is as follows:

 

from pyspark.sql import functions as f, types as t

df = (spark.readStream
    .format('kafka')
    .option('subscribe', _KAFKA_TOPIC)
    .option('kafka.bootstrap.servers', _KAFKA_BROKER)
    .option('kafka.security.protocol', 'SSL')
    # Trust store / key store are loaded from files; passwords come from a secret scope
    .option('kafka.ssl.truststore.location', f'{_AUTH_KAFKA_URL}client.truststore.jks')
    .option('kafka.ssl.truststore.password', dbutils.secrets.get(scope='auth-secret-scope', key='kafka.ssl.truststore.password'))
    .option('kafka.ssl.truststore.type', 'JKS')
    .option('kafka.ssl.keystore.location', f'{_AUTH_KAFKA_URL}client.keystore.p12')
    .option('kafka.ssl.keystore.password', dbutils.secrets.get(scope='auth-secret-scope', key='kafka.ssl.keystore.password'))
    .option('kafka.ssl.keystore.type', 'PKCS12')
    .option('startingOffsets', 'earliest')
    .load()
    # Kafka delivers key/value as binary; cast both to readable strings
    .withColumn('key', f.col('key').cast(t.StringType()))
    .withColumn('value', f.col('value').cast(t.StringType())))

 

But running the pipeline results in the following error:

AnalysisException: [UNSUPPORTED_STREAMING_OPTIONS_PERMISSION_ENFORCED] Streaming options kafka.ssl.truststore.location, kafka.ssl.keystore.location are not supported for data source kafka on a shared cluster.

Are there plans for DLT pipelines to run on something other than a shared cluster while still writing to Unity Catalog? Or to support the kafka.ssl.truststore.location and kafka.ssl.keystore.location options on shared clusters?
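In the meantime, one possible workaround: the error only flags the file-based `*.location` options, and the Kafka client also accepts trust/key material inline as PEM strings (`ssl.truststore.type=PEM` with `ssl.truststore.certificates`, and `ssl.keystore.certificate.chain` / `ssl.keystore.key`), which avoids file paths entirely. A minimal sketch, assuming you can obtain the PEM content (the helper name and parameters are illustrative, not from the original code):

```python
def build_kafka_ssl_options(broker, topic, truststore_pem, keystore_chain_pem, key_pem):
    """Build Kafka readStream options that pass SSL material by value (PEM),
    avoiding the kafka.ssl.*.location options blocked on shared clusters."""
    return {
        'subscribe': topic,
        'kafka.bootstrap.servers': broker,
        'kafka.security.protocol': 'SSL',
        # Trust store passed inline as PEM instead of a JKS file path
        'kafka.ssl.truststore.type': 'PEM',
        'kafka.ssl.truststore.certificates': truststore_pem,
        # Key store passed inline as PEM: certificate chain plus private key
        'kafka.ssl.keystore.type': 'PEM',
        'kafka.ssl.keystore.certificate.chain': keystore_chain_pem,
        'kafka.ssl.keystore.key': key_pem,
        'startingOffsets': 'earliest',
    }
```

Usage would then be `spark.readStream.format('kafka').options(**build_kafka_ssl_options(...)).load()`. Whether shared-cluster permission enforcement also accepts these PEM options is something to verify against the current Databricks documentation.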

2 REPLIES

Harrison_S
New Contributor III

Hello, I believe this is already being worked on and should be in the next release for DLT pipelines.

gabriall
New Contributor II

Indeed, it's already patched. You just have to configure your pipeline on the "preview" channel.
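For reference, the channel is set in the pipeline settings (UI: Settings > Advanced > Channel, or in the pipeline JSON). A minimal sketch of the JSON, with the name, catalog, schema, and notebook path as illustrative placeholders:

```json
{
  "name": "kafka-ssl-pipeline",
  "channel": "PREVIEW",
  "catalog": "main",
  "target": "my_schema",
  "libraries": [
    { "notebook": { "path": "/path/to/dlt_notebook" } }
  ]
}
```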