Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

The inference table is not updated

kcdharma
New Contributor

Hi,
I am deploying a model with the following code:

from databricks.sdk import WorkspaceClient
from databricks.sdk.errors import NotFound
from databricks.sdk.service.serving import (
    AiGatewayConfig,
    AiGatewayInferenceTableConfig,
    EndpointCoreConfigInput,
    ServedEntityInput,
)

w = WorkspaceClient()

model_cfg = {
    "entity_name": uc_model,
    "entity_version": str(version),
    "workload_type": "CPU",
    "workload_size": "Small",
    "scale_to_zero_enabled": True,
}

# AI Gateway config that enables the inference (payload) table
ai_gateway_config = AiGatewayConfig(
    inference_table_config=AiGatewayInferenceTableConfig(
        catalog_name=catalog,
        schema_name=schema,
        enabled=True,
    )
)

served_model = ServedEntityInput.from_dict(model_cfg)
endpoint_config = EndpointCoreConfigInput(served_entities=[served_model])

try:
    # If the endpoint already exists, update its served entities
    w.serving_endpoints.get(endpoint_name)
    w.serving_endpoints.update_config(
        name=endpoint_name,
        served_entities=[served_model],
    )
except NotFound:
    # Otherwise create it with the AI Gateway config attached
    w.serving_endpoints.create(
        name=endpoint_name,
        ai_gateway=ai_gateway_config,
        config=endpoint_config,
        route_optimized=True,
    )

It creates the inference table and the endpoint makes predictions, but it doesn't log the requests and responses. Could someone point out what I am missing? Thank you.

1 REPLY

Amruth_Ashok
Databricks Employee

Hi Dharma,

As mentioned in the documentation, inference table log delivery is currently best-effort, but logs are usually available within 1 hour of a request.
Please try querying the inference table after waiting for an hour.
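If it helps, here is a minimal sketch of such a check. The table name pattern (`<catalog>.<schema>.<endpoint_name>_payload`) and the column names are assumptions based on the inference table documentation, and the catalog, schema, and endpoint name below are placeholders for your own values.

```python
# Sketch: query the AI Gateway inference (payload) table after the delay.
# Table naming and column names are assumptions; substitute your own values.
catalog = "main"               # placeholder
schema = "ml"                  # placeholder
endpoint_name = "my_endpoint"  # placeholder

table = f"{catalog}.{schema}.{endpoint_name}_payload"
query = (
    f"SELECT request_time, status_code, request, response "
    f"FROM {table} "
    f"ORDER BY request_time DESC LIMIT 10"
)

# In a Databricks notebook:
# display(spark.sql(query))
```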


There are certain scenarios where records don't show up in the inference tables. For example, if a request is simply not authorized with Databricks credentials, there is no way for us to log it, as we cannot admit the request into the Databricks system. This extends to rate limits and a few other cases as well. TL;DR: we generally guarantee that all scored requests are logged, but various issues can occur before model scoring/inference.
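As a side note, the update path in your snippet only passes served entities, so it may also be worth reading the endpoint back and confirming the AI Gateway inference-table config is still attached. A small sketch of such a check; the nested dict shape is an assumption based on the SDK's `as_dict()` output, and the `WorkspaceClient` usage is shown in comments since it needs a live workspace:

```python
# Sketch: check whether an endpoint reports an enabled inference table.
# The nested dict shape is an assumption based on the SDK's as_dict() output.

def gateway_logging_enabled(endpoint: dict) -> bool:
    """True if the endpoint's AI Gateway inference table is enabled."""
    gw = (endpoint or {}).get("ai_gateway") or {}
    tbl = gw.get("inference_table_config") or {}
    return bool(tbl.get("enabled"))

# With a live workspace (databricks-sdk installed):
# from databricks.sdk import WorkspaceClient
# w = WorkspaceClient()
# ep = w.serving_endpoints.get("my_endpoint")  # placeholder name
# print(gateway_logging_enabled(ep.as_dict()))
```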

I hope this helps 🙂