11-27-2024 09:22 PM
py4j.protocol.Py4JJavaError: An error occurred while calling o123.sql.
: io.unitycatalog.client.ApiException: generateTemporaryPathCredentials call failed with: 401 - {"error_code":"UNAUTHENTICATED","message":"Request to generate access credential for path 'abfss://ext-location@databrickspocdixon.dfs.core.windows.net/tables/demoTab3' from outside of Databricks Unity Catalog enabled compute environment is denied for security. Please contact Databricks support for integrations with Unity Catalog.","details":[{"@type":"type.googleapis.com/google.rpc.ErrorInfo","reason":"UNITY_CATALOG_EXTERNAL_GENERATE_PATH_CREDENTIALS_DENIED","domain":"unity-catalog.databricks.com","metadata":{"path":"abfss://ext-location@databrickspocdixon.dfs.core.windows.net/tables/demoTab3"}},{"@type":"type.googleapis.com/google.rpc.RequestInfo","request_id":"dafd42ef-1dd1-4225-b41e-ac44be07e6d5","serving_data":""}]}
at io.unitycatalog.client.api.TemporaryCredentialsApi.getApiException(TemporaryCredentialsApi.java:78)
at io.unitycatalog.client.api.TemporaryCredentialsApi.generateTemporaryPathCredentialsWithHttpInfo(TemporaryCredentialsApi.java:192)
at io.unitycatalog.client.api.TemporaryCredentialsApi.generateTemporaryPathCredentials(TemporaryCredentialsApi.java:170)
at io.unitycatalog.spark.UCSingleCatalog.createTable(UCSingleCatalog.scala:106)
at org.apache.spark.sql.execution.datasources.v2.CreateTableExec.run(CreateTableExec.scala:44)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
12-02-2024 03:00 AM - edited 12-02-2024 03:04 AM
Hello @dixonantony ,
Have you followed the steps here - https://docs.databricks.com/en/data-governance/unity-catalog/access-open-api.html#external-data-acce...
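For reference, the setup in that doc boils down to pointing open-source Spark at the Unity Catalog open APIs through a custom catalog. A rough sketch of the Spark configuration (the catalog name `unity`, the workspace URL, the token, and the version coordinates are all placeholders, not values from this thread):

```
# Illustrative spark-defaults-style config; all values are placeholders
spark.sql.catalog.unity          io.unitycatalog.spark.UCSingleCatalog
spark.sql.catalog.unity.uri      https://<workspace-url>/api/2.1/unity-catalog
spark.sql.catalog.unity.token    <personal-access-token>
spark.sql.extensions             io.delta.sql.DeltaSparkSessionExtension
```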
Thanks!
12-02-2024 03:05 AM
Also, how are you creating the table? Can you please try one of the options below:

df.write.insertInto(tbl_name)

or the v2 version:

df.writeTo(tbl_name).append()

df.write.saveAsTable does not fully support custom catalogs.
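To make the suggestion above concrete, here is a minimal sketch. It assumes a SparkSession already configured with the Unity Catalog connector (per the external data access doc linked earlier); the catalog, schema, table, and column names are made up for illustration:

```python
from pyspark.sql import SparkSession

# Assumes the session is already wired to Unity Catalog via
# spark.sql.catalog.<name> settings; "unity.default.demo_tab" is a placeholder.
spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame([(1, "alice"), (2, "bob")], ["id", "name"])

# DataFrameWriterV2 path: resolves the table through the custom catalog
df.writeTo("unity.default.demo_tab").append()

# Or, for an existing table, the v1 insertInto path:
df.write.insertInto("unity.default.demo_tab")
```

Both paths go through the catalog plugin, unlike saveAsTable, which does not fully support custom catalogs.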
12-02-2024 03:15 AM - edited 12-02-2024 03:17 AM
Thanks for the reply. I was following that doc, and SELECT on the table is working; the issue is only with CREATE TABLE:
spark.sql(f"create table if not exists {catalog}.{databasename}.testTable(id INTEGER, name VARCHAR(10), age INTEGER)").show()
The same issue is listed in https://github.com/unitycatalog/unitycatalog/issues/560, but I am not sure what exactly enabling SAFE involves.
12-02-2024 07:21 PM
I see, thanks for sharing. Regarding the SAFE flag: you can work with your Databricks point of contact to get help enabling it for the workspace.
12-02-2024 07:24 PM
Can you also validate the privileges? Grant CREATE EXTERNAL TABLE on the external location used to create the Delta table; the UNITY_CATALOG_EXTERNAL_GENERATE_PATH_CREDENTIALS_DENIED error can be raised when that privilege is missing.
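For reference, the grant would look something like this, run from a Unity Catalog-enabled Databricks context (the external location name and the principal are placeholders):

```sql
-- Placeholders: replace the external location name and principal with your own
GRANT CREATE EXTERNAL TABLE
ON EXTERNAL LOCATION `ext_location`
TO `user@example.com`;
```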
12-02-2024 07:40 PM
I already tried granting CREATE EXTERNAL TABLE, but it is not working.
2 weeks ago
There is also this blog, https://community.databricks.com/t5/technical-blog/integrating-apache-spark-with-databricks-unity-ca..., which came up while reviewing your issue. Note that all of this is in Public Preview.
2 weeks ago
You need the generateTemporaryPathCredentials API, as you are trying to create external tables.