Accessing Confluent schema registry from Databricks with Scala fails with 401 (only for Scala, not Python, and only in Databricks)

alonisser
Contributor

Note: I've tested with the same connection variables:

  1. locally with Scala: works (against the same prod schema registry)
  2. on the cluster with Python: works
  3. on the cluster with Scala: fails with a 401 auth error
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient
import scala.collection.JavaConverters._

def setupSchemaRegistry(schemaRegistryUrl: String, confluentRegistryApiKey: String, confluentRegistrySecret: String): CachedSchemaRegistryClient = {
  // The credentials format below is confluentRegistryApiKey:confluentRegistrySecret
  val props = Map(
    "basic.auth.credentials.source" -> "USER_INFO",
    "basic.auth.user.info" -> s"$confluentRegistryApiKey:$confluentRegistrySecret",
    // tried both the prefixed and unprefixed variants just to be sure
    "schema.registry.basic.auth.credentials.source" -> "USER_INFO",
    "schema.registry.url" -> schemaRegistryUrl,
    "schema.registry.basic.auth.user.info" -> s"$confluentRegistryApiKey:$confluentRegistrySecret"
  ).asJava

  // also tried the version where I pass a RestService instance instead, etc.
  // val restService = new RestService(schemaRegistryUrl)

  println(s"schema registry info $schemaRegistryUrl $confluentRegistryApiKey $confluentRegistrySecret")
  new CachedSchemaRegistryClient(schemaRegistryUrl, 100, props)
}

val schemaRegistry = setupSchemaRegistry(schemaRegistryUrl, confluentRegistryApiKey, confluentRegistrySecret)
val res = schemaRegistry.getSchemaById(number) // fails with 401
 

Databricks Runtime: 9.1

Confluent client: 6.2.1
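One way to narrow this down is to confirm what the client actually sends. With `basic.auth.credentials.source = USER_INFO`, the schema registry client sends an `Authorization: Basic base64(apiKey:secret)` header. A minimal sketch to reproduce that header by hand (the key and secret below are placeholders, not real credentials):

```scala
import java.nio.charset.StandardCharsets
import java.util.Base64

// Placeholder credentials; substitute the real Confluent API key and secret.
val apiKey = "myKey"
val secret = "mySecret"

// With basic.auth.credentials.source = USER_INFO, this is the Authorization
// header value the schema-registry client builds from user info.
val authHeader = "Basic " + Base64.getEncoder.encodeToString(
  s"$apiKey:$secret".getBytes(StandardCharsets.UTF_8))

println(authHeader)
```

You can then replay the request outside the client, e.g. `curl -u "$API_KEY:$API_SECRET" "$SCHEMA_REGISTRY_URL/schemas/ids/1"` from a `%sh` notebook cell, to see whether the 401 comes from the credentials themselves or from something on the Scala classpath.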

Why would it fail only with Scala, and only from a Databricks cluster? I can't explain it!

I'll be glad for any help

1 ACCEPTED SOLUTION


alonisser
Contributor

Found the issue: the uber JAR was mangling some of the dependency resolution, which I fixed.

Another issue: currently you can't use the 6.* branch of the Confluent schema registry client in Databricks, because its Avro version is different from the one supported in Spark 3.1.
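When an uber JAR is suspected of mangling dependency resolution, a quick diagnostic is to ask the JVM which JAR a class was actually loaded from. A small sketch (nothing here is specific to the OP's build; the class names are the standard Avro and Confluent ones):

```scala
import scala.util.{Failure, Success, Try}

// Reports which JAR a class was loaded from at runtime. On Databricks this
// helps spot an uber JAR that bundled or shadowed a dependency (e.g. Avro or
// the schema-registry client) with a version other than the one you expected.
def classOrigin(className: String): String =
  Try(Class.forName(className)) match {
    case Success(cls) =>
      Option(cls.getProtectionDomain.getCodeSource)
        .map(_.getLocation.toString)
        .getOrElse("(bootstrap class, no code source)")
    case Failure(_) => "(class not on classpath)"
  }

// Run in a notebook cell and compare against the versions your build declares.
println(classOrigin("org.apache.avro.Schema"))
println(classOrigin("io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient"))
```

If the reported location is your assembly JAR rather than the cluster's own libraries (or vice versa), that mismatch is a likely source of the 401-on-Scala-only behavior.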

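The exact fix isn't shown in the answer, but a common way to stop an uber JAR from clobbering dependency versions is to relocate the conflicting packages with sbt-assembly's shade rules. A hedged build.sbt sketch (assumes sbt-assembly; the relocation target name is arbitrary):

```scala
// build.sbt fragment (sketch, assuming the sbt-assembly plugin is enabled).
// Relocates the Avro classes bundled into the uber JAR so they cannot clash
// with the Avro version Spark 3.1 ships on the Databricks runtime.
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("org.apache.avro.**" -> "shaded.org.apache.avro.@1").inAll
)

// A typical merge strategy to resolve META-INF conflicts in the uber JAR.
assembly / assemblyMergeStrategy := {
  case PathList("META-INF", _*) => MergeStrategy.discard
  case _                        => MergeStrategy.first
}
```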


Anonymous
Not applicable

Hi @Alon Nisser - My name is Piper and I'm one of the community moderators. Welcome, and thanks for your question. Let's give this a bit longer to see what the community says.

