Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Not able to create table from external Spark

dixonantony
New Contributor III

py4j.protocol.Py4JJavaError: An error occurred while calling o123.sql.
: io.unitycatalog.client.ApiException: generateTemporaryPathCredentials call failed with: 401 - {"error_code":"UNAUTHENTICATED","message":"Request to generate access credential for path 'abfss://ext-location@databrickspocdixon.dfs.core.windows.net/tables/demoTab3' from outside of Databricks Unity Catalog enabled compute environment is denied for security. Please contact Databricks support for integrations with Unity Catalog.","details":[{"@type":"type.googleapis.com/google.rpc.ErrorInfo","reason":"UNITY_CATALOG_EXTERNAL_GENERATE_PATH_CREDENTIALS_DENIED","domain":"unity-catalog.databricks.com","metadata":{"path":"abfss://ext-location@databrickspocdixon.dfs.core.windows.net/tables/demoTab3"}},{"@type":"type.googleapis.com/google.rpc.RequestInfo","request_id":"dafd42ef-1dd1-4225-b41e-ac44be07e6d5","serving_data":""}]}
at io.unitycatalog.client.api.TemporaryCredentialsApi.getApiException(TemporaryCredentialsApi.java:78)
at io.unitycatalog.client.api.TemporaryCredentialsApi.generateTemporaryPathCredentialsWithHttpInfo(TemporaryCredentialsApi.java:192)
at io.unitycatalog.client.api.TemporaryCredentialsApi.generateTemporaryPathCredentials(TemporaryCredentialsApi.java:170)
at io.unitycatalog.spark.UCSingleCatalog.createTable(UCSingleCatalog.scala:106)
at org.apache.spark.sql.execution.datasources.v2.CreateTableExec.run(CreateTableExec.scala:44)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)

6 REPLIES

NandiniN
Databricks Employee

Also, how are you creating the table? Can you please try one of the options below:

df.write.insertInto, or the v2 equivalent df.writeTo(tbl_name).append()

df.write.saveAsTable does not fully support custom catalogs.
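For reference, a minimal sketch of the two suggested write paths. This assumes `spark` is a SparkSession already configured with the Unity Catalog Spark connector and `df` is a DataFrame matching the target table's schema; the three-part table name is a placeholder, so the actual write calls are left commented:

```python
# Placeholder three-part name: <catalog>.<schema>.<table>
tbl_name = "unity.default.demoTab3"

# v1 API: appends into an existing table, matching columns by position
# df.write.insertInto(tbl_name)

# v2 API: resolves the table through the configured custom catalog,
# which is the path that fully supports Unity Catalog
# df.writeTo(tbl_name).append()
```

The v2 `writeTo` call goes through the DataSource V2 catalog plugin (here, `UCSingleCatalog`), whereas `saveAsTable` takes the v1 path that does not fully honor custom catalogs.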

dixonantony
New Contributor III

Thanks for the reply. I was following the doc below; selecting from the table works, the issue is only with creating a table.

spark.sql(f"create table if not exists {catalog}.{databasename}.testTable(id INTEGER, name VARCHAR(10), age INTEGER)").show()

https://community.databricks.com/t5/technical-blog/integrating-apache-spark-with-databricks-unity-ca...

The same issue is listed in https://github.com/unitycatalog/unitycatalog/issues/560, but I am not sure what exactly the SAFE enabling is.

NandiniN
Databricks Employee

I see, thanks for sharing. Regarding the SAFE flag: you can work with your Databricks point of contact to get help enabling it for the workspace.

NandiniN
Databricks Employee

Can you also please validate the privileges?

Grant CREATE EXTERNAL TABLE on the external location in order to create a Delta table there. The UNITY_CATALOG_EXTERNAL_GENERATE_PATH_CREDENTIALS_DENIED error can occur because generating path credentials is blocked without that privilege.
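As a sketch of that grant, the statement below builds the SQL. The principal and external location name are placeholders (the location name is taken from the `abfss://ext-location@...` path in the error), and the statement has to be run from a Unity Catalog enabled Databricks session by the metastore admin or the external location owner, so the actual call is left commented:

```python
# Hypothetical principal; external location name inferred from the error path
principal = "dixon@example.com"
ext_location = "ext-location"

grant_sql = (
    f"GRANT CREATE EXTERNAL TABLE "
    f"ON EXTERNAL LOCATION `{ext_location}` TO `{principal}`"
)
# spark.sql(grant_sql)  # run inside UC-enabled compute, not from external Spark
```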

dixonantony
New Contributor III

I already tried granting CREATE EXTERNAL TABLE, but it is still not working.
