<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: The inference table is not updated in Machine Learning</title>
    <link>https://community.databricks.com/t5/machine-learning/the-inference-table-is-not-updated/m-p/126131#M4166</link>
    <description>&lt;P class="p1"&gt;Hi Dharma,&lt;/P&gt;
&lt;P class="p1"&gt;As mentioned in the &lt;A href="https://docs.databricks.com/aws/en/machine-learning/model-serving/inference-tables#limitations" target="_blank"&gt;documentation&lt;/A&gt;, Inference table log delivery is currently best effort, but logs are usually available within 1 hour of a request.&lt;BR /&gt;Please try to query the inference tables after waiting for an hour.&lt;/P&gt;
&lt;P class="p1"&gt;&lt;BR /&gt;There are certain scenarios where the records don't show up in the inference tables. For example, consider a scenario where the request simply is not authorized with Databricks credentials—there is no way for us to log this, as we cannot admit the request into the Databricks system. This extends to rate limits and a few other cases as well. TLDR: We generally guarantee all scored requests are logged, but various issues can occur before model scoring/inference.&lt;BR /&gt;&lt;BR /&gt;I hope this help &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;</description>
    <pubDate>Wed, 23 Jul 2025 09:32:40 GMT</pubDate>
    <dc:creator>Amruth_Ashok</dc:creator>
    <dc:date>2025-07-23T09:32:40Z</dc:date>
    <item>
      <title>The inference table is not updated</title>
      <link>https://community.databricks.com/t5/machine-learning/the-inference-table-is-not-updated/m-p/124377#M4145</link>
      <description>&lt;P&gt;Hi,&amp;nbsp;&lt;BR /&gt;I am deploying a model with the following code:&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P class="lia-indent-padding-left-60px"&gt;w = WorkspaceClient()&lt;/P&gt;&lt;P class="lia-indent-padding-left-60px"&gt;model_cfg = {&lt;BR /&gt;"entity_name": uc_model,&lt;BR /&gt;"entity_version": str(version),&lt;BR /&gt;"workload_type": "CPU",&lt;BR /&gt;"workload_size": "Small",&lt;BR /&gt;"scale_to_zero_enabled": True&lt;BR /&gt;}&lt;/P&gt;&lt;P class="lia-indent-padding-left-60px"&gt;ai_gateway_config = AiGatewayConfig(&lt;BR /&gt;inference_table_config=AiGatewayInferenceTableConfig(&lt;BR /&gt;catalog_name=catalog,&lt;BR /&gt;schema_name=schema,&lt;BR /&gt;enabled=True&lt;BR /&gt;)&lt;BR /&gt;)&lt;/P&gt;&lt;P class="lia-indent-padding-left-60px"&gt;served_model = ServedEntityInput.from_dict(model_cfg)&lt;BR /&gt;endpoint_config = EndpointCoreConfigInput(served_entities=[served_model])&lt;/P&gt;&lt;P class="lia-indent-padding-left-60px"&gt;try:&lt;BR /&gt;w.serving_endpoints.get(endpoint_name)&lt;BR /&gt;w.serving_endpoints.update_config(&lt;BR /&gt;name=endpoint_name,&lt;BR /&gt;served_entities=[served_model]&lt;BR /&gt;)&lt;/P&gt;&lt;P class="lia-indent-padding-left-60px"&gt;except NotFound:&lt;BR /&gt;w.serving_endpoints.create(&lt;BR /&gt;name=endpoint_name,&lt;BR /&gt;ai_gateway=ai_gateway_config,&lt;BR /&gt;config=endpoint_config,&lt;BR /&gt;route_optimized=True&lt;BR /&gt;)&lt;/P&gt;&lt;P&gt;The inference table gets created and the endpoint serves predictions, but the requests and responses are never logged. Could someone point out what I am missing? Thank you.&lt;/P&gt;</description>
      <pubDate>Mon, 07 Jul 2025 22:21:28 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/the-inference-table-is-not-updated/m-p/124377#M4145</guid>
      <dc:creator>kcdharma</dc:creator>
      <dc:date>2025-07-07T22:21:28Z</dc:date>
    </item>
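One thing worth checking in the snippet above (not confirmed by the thread, just a sketch): `ai_gateway=ai_gateway_config` is only passed on the `create` path, so if the endpoint already exists, the `update_config` branch never applies the AI Gateway inference-table settings. Recent versions of the Databricks Python SDK expose `serving_endpoints.put_ai_gateway`, which can attach the config to an existing endpoint; the variable names follow the question, and the availability of `put_ai_gateway` in your SDK version is an assumption to verify:

```python
# Sketch: apply the AI Gateway inference-table config to an endpoint that
# already exists (the original code only passes ai_gateway on create).
# Assumes the databricks-sdk package is installed and that
# put_ai_gateway is available in your SDK release -- verify before relying on it.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import AiGatewayInferenceTableConfig


def ensure_inference_tables(w: WorkspaceClient, endpoint_name: str,
                            catalog: str, schema: str) -> None:
    """Enable AI Gateway inference tables on an existing serving endpoint."""
    w.serving_endpoints.put_ai_gateway(
        name=endpoint_name,
        inference_table_config=AiGatewayInferenceTableConfig(
            catalog_name=catalog,
            schema_name=schema,
            enabled=True,
        ),
    )
```

Calling this after the `update_config` branch would make the update path behave like the create path with respect to request/response logging.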
    <item>
      <title>Re: The inference table is not updated</title>
      <link>https://community.databricks.com/t5/machine-learning/the-inference-table-is-not-updated/m-p/126131#M4166</link>
      <description>&lt;P class="p1"&gt;Hi Dharma,&lt;/P&gt;
&lt;P class="p1"&gt;As mentioned in the &lt;A href="https://docs.databricks.com/aws/en/machine-learning/model-serving/inference-tables#limitations" target="_blank"&gt;documentation&lt;/A&gt;, Inference table log delivery is currently best effort, but logs are usually available within 1 hour of a request.&lt;BR /&gt;Please try to query the inference tables after waiting for an hour.&lt;/P&gt;
&lt;P class="p1"&gt;&lt;BR /&gt;There are certain scenarios where the records don't show up in the inference tables. For example, consider a scenario where the request simply is not authorized with Databricks credentials—there is no way for us to log this, as we cannot admit the request into the Databricks system. This extends to rate limits and a few other cases as well. TLDR: We generally guarantee all scored requests are logged, but various issues can occur before model scoring/inference.&lt;BR /&gt;&lt;BR /&gt;I hope this help &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 23 Jul 2025 09:32:40 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/the-inference-table-is-not-updated/m-p/126131#M4166</guid>
      <dc:creator>Amruth_Ashok</dc:creator>
      <dc:date>2025-07-23T09:32:40Z</dc:date>
    </item>
  </channel>
</rss>

