Hello,
We recently detected an issue in our product deployment with Terraform.
At some point, we have some Java code that creates a schema in the "hive_metastore" catalog.
Since "hive_metastore" is the default catalog, there should be no need to specify it.
That is how the Java code was implemented (the catalog is not specified).
Until recently, this code worked properly.
But now, the same code fails with a complaint about an empty catalog.
If I explicitly set the catalog name in the Java code, then the code works fine again.
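For reference, here is a minimal sketch of the workaround (the class, method, and schema names are hypothetical; the actual production code is not shown here). It builds the CREATE SCHEMA statement with the catalog qualified explicitly instead of relying on "hive_metastore" being the default:

```java
// Hypothetical sketch -- illustrates the workaround, not the actual production code.
public class SchemaDdl {

    // Builds a CREATE SCHEMA statement. Passing an explicit catalog
    // qualifies the schema name, so the statement no longer depends on
    // "hive_metastore" being resolved as the default catalog.
    static String createSchemaSql(String catalog, String schema) {
        if (catalog == null || catalog.isEmpty()) {
            // Old behavior: unqualified schema, relies on the default catalog.
            return "CREATE SCHEMA IF NOT EXISTS " + schema;
        }
        // Workaround: catalog-qualified schema name.
        return "CREATE SCHEMA IF NOT EXISTS " + catalog + "." + schema;
    }

    public static void main(String[] args) {
        System.out.println(createSchemaSql(null, "my_schema"));
        System.out.println(createSchemaSql("hive_metastore", "my_schema"));
    }
}
```

With the qualified form, the statement executes regardless of which catalog the session treats as the default.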
The problem is that this code is in production.
Is this new behavior expected? Or is it a regression introduced at some point in Databricks?
Regards,
Loïc