<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Tracing through model serving endpoint in Generative AI</title>
    <link>https://community.databricks.com/t5/generative-ai/tracing-through-model-serving-endpoint/m-p/145350#M1582</link>
    <description>&lt;P&gt;I have deployed code running on LangGraph through a model serving endpoint. I want to trace the calls using MLflow, and I want a trace logged to the experiment whenever a user hits the serving endpoint. I have defined both of these in my code:&lt;/P&gt;&lt;PRE&gt;mlflow.set_experiment("/xxx")
mlflow.openai.autolog(
    disable=False,
)&lt;/PRE&gt;&lt;P&gt;I have also set&lt;/P&gt;&lt;PRE&gt;"ENABLE_MLFLOW_TRACING": "true",&lt;/PRE&gt;&lt;P&gt;but I still cannot see any traces in the experiment.&lt;/P&gt;</description>
    <pubDate>Tue, 27 Jan 2026 07:10:33 GMT</pubDate>
    <dc:creator>srijan1881</dc:creator>
    <dc:date>2026-01-27T07:10:33Z</dc:date>
    <item>
      <title>Tracing through model serving endpoint</title>
      <link>https://community.databricks.com/t5/generative-ai/tracing-through-model-serving-endpoint/m-p/145350#M1582</link>
      <description>&lt;P&gt;I have deployed code running on LangGraph through a model serving endpoint. I want to trace the calls using MLflow, and I want a trace logged to the experiment whenever a user hits the serving endpoint. I have defined both of these in my code:&lt;/P&gt;&lt;PRE&gt;mlflow.set_experiment("/xxx")
mlflow.openai.autolog(
    disable=False,
)&lt;/PRE&gt;&lt;P&gt;I have also set&lt;/P&gt;&lt;PRE&gt;"ENABLE_MLFLOW_TRACING": "true",&lt;/PRE&gt;&lt;P&gt;but I still cannot see any traces in the experiment.&lt;/P&gt;</description>
      <pubDate>Tue, 27 Jan 2026 07:10:33 GMT</pubDate>
      <guid>https://community.databricks.com/t5/generative-ai/tracing-through-model-serving-endpoint/m-p/145350#M1582</guid>
      <dc:creator>srijan1881</dc:creator>
      <dc:date>2026-01-27T07:10:33Z</dc:date>
    </item>
    <item>
      <title>Re: Tracing through model serving endpoint</title>
      <link>https://community.databricks.com/t5/generative-ai/tracing-through-model-serving-endpoint/m-p/145540#M1587</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/212320"&gt;@srijan1881&lt;/a&gt;,&lt;BR /&gt;What do you mean by logs here? If you mean tracing step-by-step invocations on the model serving side, you need to add these environment variables to the served model (Serving &amp;gt; your endpoint &amp;gt; Edit endpoint &amp;gt; Environment variables) and then restart the endpoint:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;ENABLE_MLFLOW_TRACING=true&lt;/LI&gt;
&lt;LI&gt;MLFLOW_EXPERIMENT_ID=&amp;lt;the numeric experiment ID, not the path&amp;gt;&lt;/LI&gt;
&lt;LI&gt;Credentials that let the endpoint write to the experiment: either DATABRICKS_HOST and DATABRICKS_TOKEN (a personal access token), or DATABRICKS_CLIENT_ID and DATABRICKS_CLIENT_SECRET (a service principal). The identity must have CAN_EDIT on the target experiment.&lt;/LI&gt;
&lt;/UL&gt;
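&lt;P&gt;The checklist above can be sketched as a quick pre-flight check on the endpoint's environment variables. A minimal sketch, assuming only the variable names listed above; the helper itself is illustrative, not part of any Databricks or MLflow API:&lt;/P&gt;

```python
# Pre-flight check for the environment variables listed above.
# Illustrative helper only; not a Databricks or MLflow API.

AUTH_PAIRS = [
    ("DATABRICKS_HOST", "DATABRICKS_TOKEN"),               # PAT auth
    ("DATABRICKS_CLIENT_ID", "DATABRICKS_CLIENT_SECRET"),  # service principal
]

def tracing_env_ok(env):
    """Return True when env satisfies the tracing checklist:
    tracing flag on, a numeric experiment ID, and one complete
    auth pair so the endpoint can write to the experiment."""
    if env.get("ENABLE_MLFLOW_TRACING", "").lower() != "true":
        return False
    # Must be the numeric experiment ID, not a workspace path.
    if not env.get("MLFLOW_EXPERIMENT_ID", "").isdigit():
        return False
    return any(all(k in env for k in pair) for pair in AUTH_PAIRS)
```

&lt;P&gt;The same dictionary of variables is what you would enter under Serving &amp;gt; Edit endpoint &amp;gt; Environment variables.&lt;/P&gt;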
&lt;P&gt;Ref Doc -&amp;nbsp;&lt;A href="https://docs.databricks.com/aws/en/mlflow3/genai/tracing/prod-tracing" target="_blank"&gt;https://docs.databricks.com/aws/en/mlflow3/genai/tracing/prod-tracing&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;Also ensure that&amp;nbsp;mlflow[databricks] &amp;gt;= 3.1&lt;/P&gt;</description>
      <pubDate>Wed, 28 Jan 2026 14:12:13 GMT</pubDate>
      <guid>https://community.databricks.com/t5/generative-ai/tracing-through-model-serving-endpoint/m-p/145540#M1587</guid>
      <dc:creator>iyashk-DB</dc:creator>
      <dc:date>2026-01-28T14:12:13Z</dc:date>
    </item>
    <item>
      <title>Re: Tracing through model serving endpoint</title>
      <link>https://community.databricks.com/t5/generative-ai/tracing-through-model-serving-endpoint/m-p/150154#M1655</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/212320"&gt;@srijan1881&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;The behavior you are seeing is expected when using a manually created model serving endpoint rather than one deployed through the Databricks Agent Framework. Here is a breakdown of why traces are not appearing and how to resolve it.&lt;/P&gt;
&lt;P&gt;UNDERSTANDING THE ISSUE&lt;/P&gt;
&lt;P&gt;When you deploy code to a model serving endpoint directly (not via the Agent Framework), setting mlflow.set_experiment() and mlflow.openai.autolog() in your model code does not automatically result in traces being written to that experiment. The model serving container environment has restrictions on which MLflow operations can write back to the tracking server, and the ENABLE_MLFLOW_TRACING environment variable alone is not sufficient to enable full experiment-level trace logging for custom-deployed endpoints.&lt;/P&gt;
&lt;P&gt;RECOMMENDED APPROACH: USE THE AGENT FRAMEWORK&lt;/P&gt;
&lt;P&gt;The supported way to get real-time MLflow tracing from a model serving endpoint is to deploy your LangGraph agent using the Databricks Agent Framework. This approach automatically configures tracing so that all interactions are logged to an MLflow experiment in real time.&lt;/P&gt;
&lt;P&gt;1. Wrap your LangGraph agent as an MLflow model&lt;/P&gt;
&lt;P&gt;Make sure your agent conforms to the MLflow ChatModel or ResponsesAgent interface. For a LangGraph-based agent, you can wrap it using mlflow.pyfunc.PythonModel or the newer mlflow.models.ChatModel / ResponsesAgent interface. Then log it to Unity Catalog:&lt;/P&gt;
&lt;PRE&gt;import mlflow

mlflow.set_registry_uri("databricks-uc")

with mlflow.start_run():
    model_info = mlflow.langchain.log_model(
        lc_model="/path/to/your/langgraph/agent",
        artifact_path="langgraph_agent",
        registered_model_name="catalog.schema.your_agent_model",
    )&lt;/PRE&gt;
&lt;P&gt;2. Deploy using agents.deploy()&lt;/P&gt;
&lt;P&gt;Install the required packages:&lt;/P&gt;
&lt;PRE&gt;%pip install "mlflow&amp;gt;=3.1.3" "databricks-agents&amp;gt;=1.1.0"
dbutils.library.restartPython()&lt;/PRE&gt;
&lt;P&gt;Then deploy:&lt;/P&gt;
&lt;PRE&gt;import mlflow
from databricks import agents

mlflow.set_experiment("/Users/your_email/your_experiment_name")

deployment = agents.deploy(
    "catalog.schema.your_agent_model",
    model_version=1,
)&lt;/PRE&gt;
&lt;P&gt;3. Verify tracing&lt;/P&gt;
&lt;P&gt;After deployment (which can take up to 15 minutes), send a request to the endpoint. Traces should appear in the MLflow experiment you specified via mlflow.set_experiment() before calling agents.deploy(). They will also be written to AI Gateway inference tables automatically for long-term retention.&lt;/P&gt;
&lt;P&gt;IMPORTANT NOTES&lt;/P&gt;
&lt;P&gt;- Set the experiment before calling agents.deploy(), not inside your model code. The experiment must be set in the notebook or script that calls deploy().&lt;/P&gt;
&lt;P&gt;- If you are deploying from a notebook inside a Databricks Git folder, real-time tracing will not work by default. You need to set the experiment to a path that is not associated with a Git folder before calling agents.deploy(). For example:&lt;/P&gt;
&lt;P&gt;mlflow.set_experiment("/Users/your_email/tracing_experiment")&lt;/P&gt;
&lt;P&gt;- All agents sharing the same endpoint will write traces to the same experiment.&lt;/P&gt;
&lt;P&gt;- Traces are also written to inference tables automatically. You can find these in the Unity Catalog under the schema associated with your endpoint.&lt;/P&gt;
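&lt;P&gt;The Git-folder caveat in the notes above can be made concrete with a small path check. A minimal sketch, assuming Databricks Git folders live under /Repos or /Workspace/Repos; the helper is illustrative, not a Databricks API:&lt;/P&gt;

```python
# Illustrative check for the Git-folder caveat: real-time tracing needs
# an experiment path outside a Git folder. Assumes Git-folder content
# lives under /Repos or /Workspace/Repos.

GIT_FOLDER_PREFIXES = ("/Repos/", "/Workspace/Repos/")

def safe_tracing_experiment(path):
    """Return True when the experiment path is usable for real-time
    tracing, i.e. not inside a Databricks Git folder."""
    return not path.startswith(GIT_FOLDER_PREFIXES)
```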
&lt;P&gt;IF YOU MUST USE A CUSTOM ENDPOINT (NOT AGENT FRAMEWORK)&lt;/P&gt;
&lt;P&gt;If you need to keep your current custom model serving setup, traces from the serving container are captured in inference tables rather than in an MLflow experiment. You can:&lt;/P&gt;
&lt;P&gt;1. Enable inference tables on your endpoint through the serving endpoint configuration UI or API.&lt;/P&gt;
&lt;P&gt;2. Query the inference table to view request/response logs as a Delta table in Unity Catalog.&lt;/P&gt;
&lt;P&gt;3. For full MLflow experiment-level tracing with custom endpoints, consider adding manual trace logging in your model's predict() method using the mlflow-tracing lightweight package and sending traces asynchronously. However, the Agent Framework path is the most straightforward and fully supported approach.&lt;/P&gt;
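&lt;P&gt;The asynchronous trace logging mentioned in point 3 can be sketched with standard-library pieces: predict() enqueues a trace record and returns immediately, while a background worker drains the queue. This is a stand-in sketch only, not the mlflow-tracing package API; the export callable is where a real trace client would be invoked:&lt;/P&gt;

```python
import queue
import threading

# Stand-in sketch of asynchronous trace export, not the mlflow-tracing
# API: predict() enqueues a trace record and returns immediately, while
# a daemon worker drains the queue and hands records to an exporter.

class AsyncTraceLogger:
    _STOP = object()  # sentinel that tells the worker to exit

    def __init__(self, export):
        self._export = export  # callable that ships one trace record
        self._q = queue.Queue()
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def log(self, record):
        """Non-blocking: call this from predict()."""
        self._q.put(record)

    def _drain(self):
        while True:
            item = self._q.get()
            if item is self._STOP:
                break
            self._export(item)

    def close(self):
        """Flush remaining records and stop the worker."""
        self._q.put(self._STOP)
        self._worker.join()
```

&lt;P&gt;In a real endpoint, the export callable would wrap the trace client's send call, and close() would run at container shutdown.&lt;/P&gt;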
&lt;P&gt;DOCUMENTATION REFERENCES&lt;/P&gt;
&lt;P&gt;- MLflow Tracing overview: &lt;A href="https://docs.databricks.com/aws/en/mlflow/mlflow-tracing.html" target="_blank"&gt;https://docs.databricks.com/aws/en/mlflow/mlflow-tracing.html&lt;/A&gt;&lt;BR /&gt;- Deploy agents with Agent Framework: &lt;A href="https://docs.databricks.com/aws/en/generative-ai/agent-framework/deploy-agent.html" target="_blank"&gt;https://docs.databricks.com/aws/en/generative-ai/agent-framework/deploy-agent.html&lt;/A&gt;&lt;BR /&gt;- Inference tables for model serving: &lt;A href="https://docs.databricks.com/aws/en/machine-learning/model-serving/inference-tables.html" target="_blank"&gt;https://docs.databricks.com/aws/en/machine-learning/model-serving/inference-tables.html&lt;/A&gt;&lt;BR /&gt;- Environment variables for model serving: &lt;A href="https://docs.databricks.com/aws/en/machine-learning/model-serving/store-env-variable-model-serving.html" target="_blank"&gt;https://docs.databricks.com/aws/en/machine-learning/model-serving/store-env-variable-model-serving.html&lt;/A&gt;&lt;BR /&gt;- MLflow Tracing instrumentation: &lt;A href="https://docs.databricks.com/aws/en/mlflow3/genai/tracing/app-instrumentation/" target="_blank"&gt;https://docs.databricks.com/aws/en/mlflow3/genai/tracing/app-instrumentation/&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;* This reply used an agent system I built to research and draft this response based on the wide set of documentation I have available and previous memory. I personally review the draft for any obvious issues and for monitoring system reliability and update it when I detect any drift, but there is still a small chance that something is inaccurate, especially if you are experimenting with brand new features.&lt;/P&gt;</description>
      <pubDate>Sun, 08 Mar 2026 05:25:50 GMT</pubDate>
      <guid>https://community.databricks.com/t5/generative-ai/tracing-through-model-serving-endpoint/m-p/150154#M1655</guid>
      <dc:creator>SteveOstrowski</dc:creator>
      <dc:date>2026-03-08T05:25:50Z</dc:date>
    </item>
    <item>
      <title>Re: Tracing through model serving endpoint</title>
      <link>https://community.databricks.com/t5/generative-ai/tracing-through-model-serving-endpoint/m-p/153738#M1742</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/112558"&gt;@iyashk-DB&lt;/a&gt;, I referred to your other responses / clarifications in the community posts whilst looking for a solution.&lt;BR /&gt;Are you Yashwanth Kiran from Amrita Vishwa Vidyapeetham university?&lt;/P&gt;</description>
      <pubDate>Wed, 08 Apr 2026 12:04:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/generative-ai/tracing-through-model-serving-endpoint/m-p/153738#M1742</guid>
      <dc:creator>ImRaNM-001</dc:creator>
      <dc:date>2026-04-08T12:04:21Z</dc:date>
    </item>
  </channel>
</rss>

