Hello,
Our company is doing a POC of Unity Catalog with Azure as the provider.
We have 2 subscriptions, each containing one Databricks workspace and one ADLS Gen2 storage account.
Initially we have the default `hive_metastore` connected to ADLS Gen2. I've created a secret scope and, in the Spark/SQL configuration, added all the authentication information for the storage account. This works: we have schemas/tables stored on this Azure storage account.
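For reference, the authentication is set up roughly like this in the Spark configuration (the storage account name, secret scope and secret key names below are placeholders):

```python
# Spark configuration for OAuth access to the hive_metastore storage account,
# using a service principal whose credentials live in a secret scope.
storage_account = "stacctsub1"   # placeholder
scope = "kv-scope"               # placeholder secret scope

spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net",
               dbutils.secrets.get(scope=scope, key="sp-client-id"))
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net",
               dbutils.secrets.get(scope=scope, key="sp-client-secret"))
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
               f"https://login.microsoftonline.com/{dbutils.secrets.get(scope=scope, key='tenant-id')}/oauth2/token")
```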
With Unity Catalog we want to share data between the two Databricks workspaces. So we created a Databricks access connector and a new container inside the same storage account, which means we will have to migrate the data from the original `hive_metastore` container to the new container used by the access connector.
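The storage credential itself was created from the access connector in the UI, but the external location pointing at the new container corresponds roughly to this (all names and URLs below are placeholders):

```python
# External location mapping the Unity Catalog container to the storage credential
# created from the Databricks access connector.
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS uc_data
  URL 'abfss://uc-data@stacctsub1.dfs.core.windows.net/'
  WITH (STORAGE CREDENTIAL uc_credential)
""")
```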
We were able to create a catalog, and through the UI I created a new schema inside it. I was also able to create a table and insert data (even though inserting 3-4 rows seems quite slow).
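Expressed as code, what I did through the UI is roughly the following (catalog/schema/table names are placeholders):

```python
# Catalog backed by the new Unity Catalog container, plus a schema and a small test table.
spark.sql("""
  CREATE CATALOG IF NOT EXISTS poc_catalog
  MANAGED LOCATION 'abfss://uc-data@stacctsub1.dfs.core.windows.net/managed'
""")
spark.sql("CREATE SCHEMA IF NOT EXISTS poc_catalog.poc_schema")
spark.sql("""
  CREATE TABLE IF NOT EXISTS poc_catalog.poc_schema.poc_table (
    id INT,
    label STRING
  )
""")
# Even this small insert already takes surprisingly long.
spark.sql("INSERT INTO poc_catalog.poc_schema.poc_table VALUES (1, 'a'), (2, 'b'), (3, 'c')")
```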
Now, from the Databricks workspace in the other subscription, I've done the configuration to access the catalog. But when I run a SELECT, the query neither finishes nor fails: it stays stuck in `running`.
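Concretely, the query in workspace B is essentially just this (same placeholder names as above):

```python
# Run from workspace B in subscription B; this never completes and never errors out.
spark.sql("SELECT * FROM poc_catalog.poc_schema.poc_table").show()
```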
In both workspace A and workspace B I'm logged in with the same account, and I'm the owner of the schema/table, so it's not a permissions issue.
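This is the kind of check I can run in either workspace to confirm ownership and grants (placeholder names again):

```python
# DESCRIBE TABLE EXTENDED shows the table owner; SHOW GRANTS lists the privileges.
spark.sql("DESCRIBE TABLE EXTENDED poc_catalog.poc_schema.poc_table").show(truncate=False)
spark.sql("SHOW GRANTS ON TABLE poc_catalog.poc_schema.poc_table").show(truncate=False)
```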
In workspace A I have the Spark configuration to access the storage account of subscription A.
In workspace B I have the Spark configuration to access the storage account of subscription B, but I don't have the configuration to access the storage account of subscription A. Could this be the issue? If so, I could just as well have created the schema/table directly in `hive_metastore` without Unity Catalog, so I guess it's not that.