
Generate a temporary table credential fails

RajeshRK
Contributor II

Hi Team,

I am trying to execute the API call below, and it is failing.

API:

curl -v  -X POST "https://dbc-xxxxxxxx-xxxx.cloud.databricks.com/api/2.0/unity-catalog/temporary-stage-credentials" -H "Authentication: Bearer xxxxxxxxxxxxxxxx" -d '{"table_id":"external.externalschema.lab_2010","operation_name": "READ"}'

RESPONSE:

{"error_code":"INVALID_PARAMETER_VALUE","message":"GenerateTemporaryStageCredential Missing required field: staging_url","details":[{"@type":"type.googleapis.com/google.rpc.ErrorInfo","reason":"INVALID_FIELD","domain":"unity-catalog.databricks.com","metadata":{"field_name":"staging_url"}},{"@type":"type.googleapis.com/google.rpc.RequestInfo","request_id":"xxxxxx-1a3e-467f-88fc-44c6db8b20e9","serving_data":""}]}

 

The API is complaining that "staging_url" is a required field, but the API documentation does not list "staging_url" as required. Here is the documentation:

https://docs.databricks.com/api/workspace/temporarytablecredentials/generatetemporarytablecredential...

Please advise how to proceed with this.

 

Regards,

Rajesh.


11 REPLIES

Walter_C
Databricks Employee

Hello, and many thanks for the question. I have tried the same API call and it worked as expected, so it seems that the table ID you are passing is incorrect. To get the proper ID, look for the table in Catalog Explorer and open the Details tab; there you will find the table ID.
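
For example, one way to look up that ID from the command line is the Unity Catalog Tables API, which returns a table_id field. This is only a sketch; the workspace URL, token, and three-level table name are placeholders:

curl --location 'https://<workspace_url>/api/2.1/unity-catalog/tables/external.externalschema.lab_2010' \
--header 'Authorization: Bearer <token>'

The JSON response should include a "table_id" UUID; that UUID, not the table name, is what the temporary table credentials endpoint expects.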

Please also make sure that the metastore has the External data access feature enabled and that the user has the EXTERNAL USE SCHEMA privilege on the schema where the table was created.
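
As a sketch of how to double-check both settings over the REST API (the workspace URL and token are placeholders, and the exact privilege spelling in the response is an assumption based on the Unity Catalog API docs):

# Metastore details; look for "external_access_enabled": true
curl --location 'https://<workspace_url>/api/2.1/unity-catalog/metastores' \
--header 'Authorization: Bearer <token>'

# Grants on the schema; the calling user (or one of their groups) should hold
# the EXTERNAL USE SCHEMA privilege (typically rendered as EXTERNAL_USE_SCHEMA by the API)
curl --location 'https://<workspace_url>/api/2.1/unity-catalog/permissions/schema/external.externalschema' \
--header 'Authorization: Bearer <token>'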

Hi Walter,

I have checked the required permission, and external data access is enabled.  Here are the catalog and metastore details:

CATALOG:

{
      "name": "external",
      "owner": ā€œXXXXXXXX@xxxxxxx.com",
      "storage_root": "s3://xxxxx-xxxxx-test/",
      "catalog_type": "MANAGED_CATALOG",
      "metastore_id": ā€œxxxxxxx-xxxx-441c-8548-f15054e97666",
      "created_at": 1734688660503,
      "created_by": ā€œxxxxxxxx@xxxxxx.com",
      "updated_at": 1734688660503,
      "updated_by": ā€œxxxxxx.xxxxxx@xxxxxx.com",
      "storage_location": "s3://xxxxx-xxxxx-xxxxx/__unitystorage/catalogs/xxxxxx-xxxx-4a50-9f8f-84fa64a4be8b",
      "isolation_mode": "OPEN",
      "accessible_in_current_workspace": true,
      "browse_only": false,
      "id": ā€œxxxxxxxx-2c9c-4a50-9f8f-xxxxxxxxxxā€,
      "full_name": "external",
      "securable_type": "CATALOG",
      "securable_kind": "CATALOG_STANDARD",
      "resource_name": "/metastores/xxxxxxxx-334c-441c-8548-xxxxxxx666/catalogs/xxxxxxxx-2c9c-4a50-9f8f-xxxxxxe8b"
    },

METASTORE:

{
  "metastores": [
    {
      "name": ā€œxxxxxxxx_aws_us_west_2",
      "default_data_access_config_id": ā€œxxxxxxx-4d26-4f21-xxxxx-14008c6b4696",
      "storage_root_credential_id": "xxxxxxx-4d26-4f21-xxxxx-14008c6b4696",
      "delta_sharing_scope": "INTERNAL",
      "owner": "System user",
      "privilege_model_version": "1.0",
      "region": "us-west-2",
      "metastore_id": "xxxxxxx-xxxx-441c-8548-f15054e97666",
      "metastore_account_id": ā€œxxxxxxx-a8aa-xxxxx-998a-xxxxxxxxxā€,
      "created_at": 1734531659103,
      "created_by": "System user",
      "updated_at": 1734708821326,
      "updated_by": ā€œxxxxxxx.xxxxxx@xxxxxx.com",
      "storage_root_credential_name": ā€œxxxxx_s3_credentials_xxxxxxxx-xxxxā€,
      "cloud": "aws",
      "global_metastore_id": "aws:us-west-2:xxxxxxxx-xxxx-551c-9558-f25056e98696",
      "full_name": "xxxxxxxx_aws_us_west_2",
      "securable_type": "METASTORE",
      "securable_kind": "METASTORE_STANDARD",
      "predictive_optimization_enabled": false,
      "external_access_enabled": true
    }
  ]
}

The user has the "EXTERNAL USE SCHEMA" privilege.

Regards,

Rajesh.

Walter_C
Databricks Employee

Got it. Can you confirm that the table ID you are using is correct? I ask because in your API call you pass this:

'{"table_id":"external.externalschema.lab_2010","operation_name": "READ"}'

For the table_id, it seems you are using the name of the table, which is not correct. You need to go to the table's details, where you will find the table ID that should be provided.
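
If it is easier, the same ID can also be pulled with the Databricks CLI. A rough sketch, assuming a recent unified CLI with Unity Catalog commands and an already configured authentication profile:

# Prints the table metadata, including the table_id UUID
databricks tables get external.externalschema.lab_2010

The table_id value from that output is what goes into the credentials request.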

Yes, I corrected it:

curl -v  -X POST "https://dbc-xxxxx-xxxx.cloud.databricks.com/api/2.0/unity-catalog/temporary-stage-credentials" -H "Authentication: Bearer xxxxxxxxxxxxx" -d '{"table_id":"xxxxx-c4b4-4827-a834-xxxxxxx","operation_name": "READ"}'

Walter_C
Databricks Employee

Are you still receiving the same exact error, or a different one?

If it is the same error, try running it this way:

curl --location 'https://<workspace_url>/api/2.0/unity-catalog/temporary-table-credentials' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer ***********' \
--data '{
"table_id": "xxxx-c4b4-4827-a834-xxxxxxx",
"operation": "READ"
}'

 

I am receiving the same error.

RajeshRK
Contributor II

curl --location 'https://dbc-xxxxxxx-xxxx.cloud.databricks.com/api/2.0/unity-catalog/temporary-stage-credentials' \
--header 'Content-type: application/json' \
--header 'Authentication: Bearer xxxxxxxxxxxxxxxxxxx' \
--data '{"table_id":"xxxx-c4b4-4827-a834-xxxxxxx","operation_name": "READ"}'

{"error_code":"INVALID_PARAMETER_VALUE","message":"GenerateTemporaryStageCredential Missing required field: staging_url","details":[{"@type":"type.googleapis.com/google.rpc.ErrorInfo","reason":"INVALID_FIELD","domain":"unity-catalog.databricks.com","metadata":{"field_name":"staging_url"}},{"@type":"type.googleapis.com/google.rpc.RequestInfo","request_id":"f289990f-a5f7-40e8-8cea-2b7cc25c353e","serving_data":""}]}

Walter_C
Databricks Employee

Can you try this:

curl --location 'https://<workspace_url>/api/2.0/unity-catalog/temporary-table-credentials' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer ***********' \
--data '{
"table_id": "xxxx-c4b4-4827-a834-xxxxxxx",
"operation": "READ"
}'

It seems the field is called operation and not operation_name.

 

Same error:

curl --location 'https://dbc-xxxxxx-xxxx.cloud.databricks.com/api/2.0/unity-catalog/temporary-stage-credentials' \
--header 'Content-type: application/json' \
--header 'Authentication: Bearer xxxxxxxxxxxxxxxxxxxxxxxx' \
--data '{"table_id":"xxxxxx-xxxxx-4827-a834-xxxxxxxxx","operation": "READ"}'

{"error_code":"INVALID_PARAMETER_VALUE","message":"GenerateTemporaryStageCredential Missing required field: staging_url","details":[{"@type":"type.googleapis.com/google.rpc.ErrorInfo","reason":"INVALID_FIELD","domain":"unity-catalog.databricks.com","metadata":{"field_name":"staging_url"}},{"@type":"type.googleapis.com/google.rpc.RequestInfo","request_id":"8a541bb5-ee18-45ea-aedb-0f03eb72dcf0","serving_data":""}]}

Walter_C
Databricks Employee

Your endpoint also seems to be incorrect. Per the doc https://docs.databricks.com/api/gcp/workspace/temporarytablecredentials/generatetemporarytablecreden... it has to be /api/2.0/unity-catalog/temporary-table-credentials, but you are using /api/2.0/unity-catalog/temporary-stage-credentials.
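
For reference, a minimal sketch of the corrected call combining the fixes from this thread: the temporary-table-credentials endpoint, the operation field, and the standard Authorization header used in the earlier examples. The workspace URL, token, and table ID are placeholders:

curl --location 'https://<workspace_url>/api/2.0/unity-catalog/temporary-table-credentials' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer <token>' \
--data '{
"table_id": "<table-id-uuid>",
"operation": "READ"
}'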

Wow, it worked.  Thank you so much!
