3 weeks ago
py4j.protocol.Py4JJavaError: An error occurred while calling o123.sql.
: io.unitycatalog.client.ApiException: generateTemporaryPathCredentials call failed with: 401 - {"error_code":"UNAUTHENTICATED","message":"Request to generate access credential for path 'abfss://ext-location@databrickspocdixon.dfs.core.windows.net/tables/demoTab3' from outside of Databricks Unity Catalog enabled compute environment is denied for security. Please contact Databricks support for integrations with Unity Catalog.","details":[{"@type":"type.googleapis.com/google.rpc.ErrorInfo","reason":"UNITY_CATALOG_EXTERNAL_GENERATE_PATH_CREDENTIALS_DENIED","domain":"unity-catalog.databricks.com","metadata":{"path":"abfss://ext-location@databrickspocdixon.dfs.core.windows.net/tables/demoTab3"}},{"@type":"type.googleapis.com/google.rpc.RequestInfo","request_id":"dafd42ef-1dd1-4225-b41e-ac44be07e6d5","serving_data":""}]}
at io.unitycatalog.client.api.TemporaryCredentialsApi.getApiException(TemporaryCredentialsApi.java:78)
at io.unitycatalog.client.api.TemporaryCredentialsApi.generateTemporaryPathCredentialsWithHttpInfo(TemporaryCredentialsApi.java:192)
at io.unitycatalog.client.api.TemporaryCredentialsApi.generateTemporaryPathCredentials(TemporaryCredentialsApi.java:170)
at io.unitycatalog.spark.UCSingleCatalog.createTable(UCSingleCatalog.scala:106)
at org.apache.spark.sql.execution.datasources.v2.CreateTableExec.run(CreateTableExec.scala:44)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
3 weeks ago - last edited 3 weeks ago
Hello @dixonantony,
Have you followed the steps here: https://docs.databricks.com/en/data-governance/unity-catalog/access-open-api.html#external-data-acce... ?
Thanks!
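For reference, a minimal sketch of how a Spark session is typically configured for Unity Catalog external data access, assuming the open-source unitycatalog-spark connector; the catalog name, workspace URL and token below are placeholders:

from pyspark.sql import SparkSession

# "uc" is a placeholder catalog name; the URL and token are illustrative only.
spark = (
    SparkSession.builder
    .appName("uc-external-access")
    # Delta extension, since the tables in question are Delta tables
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    # Route the "uc" catalog through the Unity Catalog Spark connector
    .config("spark.sql.catalog.uc", "io.unitycatalog.spark.UCSingleCatalog")
    .config("spark.sql.catalog.uc.uri", "https://<workspace-url>/api/2.1/unity-catalog")
    .config("spark.sql.catalog.uc.token", "<personal-access-token>")
    .getOrCreate()
)

spark.sql("SELECT * FROM uc.demo_schema.demoTab3").show()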
3 weeks ago
Also, how are you creating the table?
Can you please try one of the options below:
df.write.insertInto
or the v2 version df.writeTo(tbl_name).append()
df.write.saveAsTable does not fully support custom catalogs.
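A rough sketch of those two write paths, assuming the target table already exists; the catalog, schema and table names are placeholders:

from pyspark.sql import Row

df = spark.createDataFrame([Row(id=1, name="abc", age=30)])

# V2 writer: resolves the table through the configured custom catalog
df.writeTo("uc.demo_schema.demoTab3").append()

# Classic writer: positional insert into an existing table with a matching schema
df.write.insertInto("uc.demo_schema.demoTab3")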
3 weeks ago - last edited 3 weeks ago
Thanks for the reply. I was following the doc, and SELECT on the table works; the issue is with CREATE TABLE.
spark.sql(f"create table if not exists {catalog}.{databasename}.testTable(id INTEGER, name VARCHAR(10), age INTEGER)").show()
The same issue is listed in https://github.com/unitycatalog/unitycatalog/issues/560, but I'm not sure what exactly the SAFE enabling is.
3 weeks ago
I see, thanks for sharing. For the SAFE flag, you can work with your Databricks point of contact to get help enabling it for the workspace.
3 weeks ago
Can you also validate the privileges, please?
Grant CREATE EXTERNAL TABLE on the external location used for the Delta table; UNITY_CATALOG_EXTERNAL_GENERATE_PATH_CREDENTIALS_DENIED is raised because generating path credentials is blocked behind that privilege.
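A hedged example of that grant, to be run from a Databricks notebook or SQL editor rather than the external Spark session; the external location name and principal are placeholders:

# External location name and principal below are illustrative only.
spark.sql("""
    GRANT CREATE EXTERNAL TABLE
    ON EXTERNAL LOCATION `ext-location`
    TO `user@example.com`
""")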
3 weeks ago
I already tried granting CREATE EXTERNAL TABLE, but it is still not working.