Note, I've tested with the same connection variables:
- locally with Scala: works (via the same prod schema registry)
- on the cluster with Python: works
- on the cluster with Scala: fails with a 401 auth error
```scala
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient
import scala.collection.JavaConverters._

def setupSchemaRegistry(schemaRegistryUrl: String,
                        confluentRegistryApiKey: String,
                        confluentRegistrySecret: String): CachedSchemaRegistryClient = {
  // The credentials format is confluentRegistryApiKey:confluentRegistrySecret
  val props = Map(
    "basic.auth.credentials.source" -> "USER_INFO",
    "basic.auth.user.info" -> s"$confluentRegistryApiKey:$confluentRegistrySecret",
    // tried the "schema.registry."-prefixed variants as well, just to be sure
    "schema.registry.basic.auth.credentials.source" -> "USER_INFO",
    "schema.registry.url" -> schemaRegistryUrl,
    "schema.registry.basic.auth.user.info" -> s"$confluentRegistryApiKey:$confluentRegistrySecret"
  ).asJava

  println(s"schema registry info $schemaRegistryUrl $confluentRegistryApiKey $confluentRegistrySecret") // debug only
  // also tried the variant where I pass a new RestService(schemaRegistryUrl) instead, etc.
  new CachedSchemaRegistryClient(schemaRegistryUrl, 100, props)
}

val schemaRegistry = setupSchemaRegistry(schemaRegistryUrl, confluentRegistryApiKey, confluentRegistrySecret)
val res = schemaRegistry.getSchemaById(number) // gets 401
```
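One way to take the Confluent client out of the picture is to hit the registry with a raw HTTP request from the same cluster, building the Basic auth header by hand. That header is exactly what `USER_INFO` auth should send; if a manual request also gets a 401, the problem is the credentials or network path, not the client config. A minimal sketch (the key/secret values below are placeholders):

```scala
import java.net.{HttpURLConnection, URL}
import java.nio.charset.StandardCharsets
import java.util.Base64

// Placeholder credentials -- substitute your real API key/secret.
val apiKey = "MYKEY"
val secret = "MYSECRET"

// The Authorization header value that USER_INFO basic auth produces:
// "Basic " + base64("<apiKey>:<secret>")
val authHeader = "Basic " + Base64.getEncoder.encodeToString(
  s"$apiKey:$secret".getBytes(StandardCharsets.UTF_8))

// On the cluster, uncomment to call e.g. GET /schemas/ids/<id> directly:
// val conn = new URL(s"$schemaRegistryUrl/schemas/ids/1")
//   .openConnection().asInstanceOf[HttpURLConnection]
// conn.setRequestProperty("Authorization", authHeader)
// println(conn.getResponseCode) // 200 means the credentials work from this cluster
```

If the manual call succeeds while `getSchemaById` still fails, that points at the client or its configuration rather than the credentials themselves.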
- Databricks Runtime: 9.1
- Confluent client: 6.2.1
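Since Databricks runtimes bundle their own Kafka/Confluent jars, one hypothesis worth checking is that the cluster loads a different schema-registry client than the 6.2.1 you attached, and that version handles the `basic.auth.*` settings differently. A quick way to see which jar actually provides the class (`jarOf` is a hypothetical helper, not a Databricks API):

```scala
// Returns the jar a class was loaded from, or None if it isn't on the classpath
// (classes from the bootstrap classloader may also report no code source).
def jarOf(className: String): Option[String] =
  try {
    Option(Class.forName(className).getProtectionDomain.getCodeSource)
      .map(_.getLocation.toString)
  } catch { case _: ClassNotFoundException => None }

// Run this on the cluster: does it point at your attached 6.2.1 jar,
// or at a jar shipped with the Databricks runtime?
println(jarOf("io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient"))
```

If the printed path is a runtime-provided jar rather than your uploaded library, the local-vs-cluster and Python-vs-Scala discrepancy would be explained by a classpath conflict.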
Why would it fail only in Scala, and only from the Databricks cluster? I can't explain it!
I'd be glad for any help.