I successfully registered an external database ```dwcore```, hosted on SQL Server, in my Unity Catalog.
I first added the connection under "External Data": I tested the connection and it was successful.
I then added the database on top of the connection: I tested it again and it was successful.
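For reference, this is roughly the SQL equivalent of what I did through the UI. It is only a sketch: the connection name, host, port, and secret scope/keys below are placeholders, not my real values.

```python
# Rough SQL equivalent of the UI steps above (all values are placeholders).
spark.sql("""
    CREATE CONNECTION IF NOT EXISTS sqlserver_dwcore TYPE sqlserver
    OPTIONS (
        host 'my-sqlserver-host.example.com',
        port '1433',
        user secret('my_scope', 'sqlserver_user'),
        password secret('my_scope', 'sqlserver_password')
    )
""")

# Foreign catalog on top of the connection, pointing at the dwcore database.
spark.sql("""
    CREATE FOREIGN CATALOG IF NOT EXISTS dwcore
    USING CONNECTION sqlserver_dwcore
    OPTIONS (database 'dwcore')
""")
```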
Then, in the Unity Catalog explorer, I checked and saw that the tables appear under the "dbo" schema.
I can see all the tables listed in the tree on the left.
But then, if I try to query any table, the table is not found by Spark.
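For example, with one of my tables (the exact error text may differ slightly, but it is a "table not found" kind of failure):

```python
# "allalarms" is listed under dwcore.dbo in the Catalog Explorer,
# but querying it through the foreign catalog fails:
spark.sql("SELECT * FROM dwcore.dbo.allalarms LIMIT 10").show()
# -> fails with a "table or view not found" error,
#    even though the table is visible in the explorer tree
```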
I tested with another Postgres database and everything works like a charm.
I think the issue is this: my SQL Server table names are case sensitive.
So for example I have a table "AllAlarms" that gets registered all lowercase in the external catalog: it becomes "allalarms".
So when I try to query the table in Unity Catalog, the real SQL Server table does not get resolved, because I'm querying with a lowercase name when the table should have been registered with its real case.
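A quick way to see the lowercasing (just a sketch of the check, not a fix):

```python
# List what Unity Catalog actually registered for the dbo schema.
# In my case "AllAlarms" only shows up here as "allalarms".
spark.sql("SHOW TABLES IN dwcore.dbo").show(truncate=False)
```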
I tried as well to make Spark case sensitive with
```spark.conf.set('spark.sql.caseSensitive', True)```
but that does not help, because the error occurs when the table is registered in the catalog.
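For completeness, this is roughly what I ran (behavior unchanged):

```python
# Make identifier resolution case sensitive on the Spark side ...
spark.conf.set("spark.sql.caseSensitive", True)

# ... but the query still fails, since the lowercase name was already
# baked in when the foreign catalog registered the table.
spark.sql("SELECT * FROM dwcore.dbo.allalarms LIMIT 10").show()
```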
So I think this is a bug with Databricks.