Populate client_request_id in Model Serving inference table
09-13-2024 12:59 AM
Hi,
The documentation for Model Serving inference tables states that the client_request_id column is usually null. How can I populate this column with a request ID generated by the calling .NET application when invoking the model through the Databricks REST API?
See https://docs.databricks.com/en/generative-ai/deploy-agent.html
| Column name | Type | Description |
| --- | --- | --- |
| client_request_id | String | Client request ID, usually null. |
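To frame what I'm after: I want to attach an ID to each scoring request so it appears in the inference table. Below is a minimal sketch of the request body I have in mind (Python here for brevity; the same JSON would be sent from .NET's HttpClient). It assumes client_request_id can be passed as a top-level field in the request body, which I have not been able to confirm from the docs; the endpoint name and workspace URL are placeholders.

```python
import json

def build_payload(inputs, client_request_id):
    """Build a scoring request body carrying a caller-supplied request ID.

    Assumption (unconfirmed): the serving endpoint accepts a top-level
    "client_request_id" field and writes it to the inference table.
    """
    return {
        "inputs": inputs,
        "client_request_id": client_request_id,
    }

payload = build_payload([[1.0, 2.0]], "dotnet-req-12345")
body = json.dumps(payload)

# The body would then be POSTed to a URL of the form
#   https://<workspace-host>/serving-endpoints/<endpoint-name>/invocations
# with an "Authorization: Bearer <token>" header, e.g. via HttpClient in .NET.
```

Is this the intended mechanism, or is there another way to correlate a caller-side request ID with rows in the inference table?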
Labels:
- Model Serving
1 REPLY