
Accessing Confluent schema registry from Databricks with Scala fails with 401 (only with Scala, not Python, and only in Databricks)

alonisser
Contributor

Note: I've tested with the same connection variables:

  1. Locally with Scala - works (against the same prod schema registry)
  2. On the cluster with Python - works
  3. On the cluster with Scala - fails with a 401 auth error
import scala.collection.JavaConverters._
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient

def setupSchemaRegistry(schemaRegistryUrl: String, confluentRegistryApiKey: String, confluentRegistrySecret: String): CachedSchemaRegistryClient = {
  // The format of the schema registry credentials is confluentRegistryApiKey:confluentRegistrySecret
  val props = Map(
    "basic.auth.credentials.source" -> "USER_INFO",
    "basic.auth.user.info" -> s"$confluentRegistryApiKey:$confluentRegistrySecret",
    "schema.registry.basic.auth.credentials.source" -> "USER_INFO", // tried both prefixes just to be sure
    "schema.registry.url" -> schemaRegistryUrl,
    "schema.registry.basic.auth.user.info" -> s"$confluentRegistryApiKey:$confluentRegistrySecret"
  ).asJava

  // val restService = new RestService(schemaRegistryUrl)
  // also tried the variant where I pass the RestService in instead, etc.
  println(s"schema registry info $schemaRegistryUrl $confluentRegistryApiKey $confluentRegistrySecret")
  new CachedSchemaRegistryClient(schemaRegistryUrl, 100, props)
}

val schemaRegistry = setupSchemaRegistry(schemaRegistryUrl, confluentRegistryApiKey, confluentRegistrySecret)
val res = schemaRegistry.getSchemaById(number) // gets 401

Databricks runtime: 9.1

Confluent client: 6.2.1

Why would it fail only with Scala, and only from the Databricks cluster? I can't explain it!

I'd be glad for any help.
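For what it's worth, here is the kind of bare-bones check that could isolate whether the 401 comes from the credentials themselves or from the client/classpath on the cluster. This is a minimal sketch using only the JDK; checkRegistryAuth is just a throwaway helper name, and it assumes the registry's standard /subjects REST endpoint behind Basic auth:

import java.net.{HttpURLConnection, URL}
import java.nio.charset.StandardCharsets
import java.util.Base64

// Call the registry's REST API directly with Basic auth, bypassing the Confluent
// client entirely, so a 200 here means the credentials and the network path from
// the cluster are fine.
def checkRegistryAuth(schemaRegistryUrl: String, apiKey: String, secret: String): Int = {
  val conn = new URL(s"$schemaRegistryUrl/subjects").openConnection().asInstanceOf[HttpURLConnection]
  val token = Base64.getEncoder.encodeToString(s"$apiKey:$secret".getBytes(StandardCharsets.UTF_8))
  conn.setRequestMethod("GET")
  conn.setRequestProperty("Authorization", s"Basic $token")
  val code = conn.getResponseCode
  conn.disconnect()
  code
}

println(checkRegistryAuth(schemaRegistryUrl, confluentRegistryApiKey, confluentRegistrySecret)) // expect 200 if auth is OK

If this returns 200 from the same cluster where the Scala client gets a 401, the problem is likely in how the client (or its bundled dependencies) is loaded rather than in the credentials.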

1 ACCEPTED SOLUTION

alonisser
Contributor

Found the issue: the uber JAR was mangling some of the dependency resolution, which I fixed.

Another issue: currently you can't use the 6.* branch of the Confluent schema registry client on Databricks, because its Avro version is different from the one supported by Spark 3.1.
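For anyone landing here later, a rough sketch of what that kind of fix can look like in an sbt build. The coordinates, version number, and merge rules below are illustrative assumptions, not the exact build from this thread; the idea is to stay on a 5.x client whose Avro lines up with Spark 3.1 and to keep the uber JAR from clobbering dependency metadata.

// build.sbt sketch (illustrative; version and rules are assumptions)
resolvers += "confluent" at "https://packages.confluent.io/maven/"

libraryDependencies +=
  // stay on the 5.x line of the client; exclude its Avro so the
  // cluster-provided Avro (the one Spark 3.1 supports) is used instead
  ("io.confluent" % "kafka-schema-registry-client" % "5.3.2")
    .exclude("org.apache.avro", "avro")

// with sbt-assembly, merge service-loader metadata instead of discarding it,
// so the uber JAR doesn't break the client's provider lookups
assembly / assemblyMergeStrategy := {
  case PathList("META-INF", "services", _*) => MergeStrategy.concat
  case PathList("META-INF", _*)             => MergeStrategy.discard
  case _                                    => MergeStrategy.first
}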


REPLIES

Anonymous
Not applicable

Hi @Alon Nisser - My name is Piper and I'm one of the community moderators. Welcome and thanks for your question. Let's give this a bit longer to see what the community says.

