Great blog post: https://community.databricks.com/t5/technical-blog/integrating-apache-spark-with-databricks-unity-ca...
I have attempted to reproduce this with Azure Databricks and ADLS Gen2 as the storage backend.
Although I'm able to interact with Unity Catalog (a "use schema" followed by "select current_schema()" succeeds, and so on), when I try to append rows to a newly created managed table (as in the example above), I get the error below.
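For reference, here's roughly what I'm running; the catalog, schema, and table names below are placeholders, not the actual names from my environment:

```sql
-- Catalog/schema interaction works:
USE CATALOG my_catalog;     -- placeholder name
USE SCHEMA my_schema;       -- placeholder name
SELECT current_schema();    -- returns the expected schema

-- Creating the managed table also succeeds:
CREATE TABLE numbers (n INT);

-- But appending rows fails with the 403 below:
INSERT INTO numbers VALUES (1), (2);
```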
It looks like the temporary credential vended by UC is being rejected by the storage account. Any ideas what could be wrong here?
java.nio.file.AccessDeniedException: Operation failed: "This request is not authorized to perform this operation using this permission.", 403, PUT, https://storageaccout-xyz.dfs.core.windows.net/some/dir/__unitycatalog/..."
Any ideas what I'm doing wrong? cc @dkushari
P.S. It looks like the ™ in the title of your blog post is preventing anyone from commenting there.
P.P.S. This forum software is extremely painful. It won't accept the post without a label, yet it makes it incredibly difficult to actually select one.