Generative AI
Explore discussions on generative artificial intelligence techniques and applications within the Databricks Community. Share ideas, challenges, and breakthroughs in this cutting-edge field.

Inference table not working for Gemma 3 12b

damselfly20
New Contributor III

Hi, I have a problem with the inference table for Gemma 3 12b. If I create a serving endpoint for the model (from system.ai.gemma-3-12b-it) with an inference table, the table gets created but always stays empty, no matter how many requests I make or where I make them from (API, Playground, Query endpoint window). Every one of those requests does return an actual response.

I'm aware that entries in the inference table can show up with some delay, but even after hours nothing appears. I have also tried deleting and re-creating the endpoint and inference table multiple times, with no success.

These are my settings when creating the endpoint:

[Screenshot: create_gemma_serving_endpoint.png — settings used when creating the serving endpoint]
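For reference, this is roughly the equivalent of those settings via the Databricks Python SDK. The endpoint name, catalog, schema, table prefix, and throughput numbers below are placeholders, not my exact values from the screenshot:

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import (
    AutoCaptureConfigInput,
    EndpointCoreConfigInput,
    ServedEntityInput,
)

w = WorkspaceClient()

w.serving_endpoints.create(
    name="gemma-3-12b-endpoint",  # placeholder endpoint name
    config=EndpointCoreConfigInput(
        served_entities=[
            ServedEntityInput(
                entity_name="system.ai.gemma-3-12b-it",
                entity_version="1",               # placeholder version
                min_provisioned_throughput=0,     # placeholder sizing
                max_provisioned_throughput=9500,  # placeholder sizing
                scale_to_zero_enabled=True,
            )
        ],
        # Inference table ("auto capture") settings
        auto_capture_config=AutoCaptureConfigInput(
            catalog_name="main",                  # placeholder catalog
            schema_name="default",                # placeholder schema
            table_name_prefix="gemma_inference",  # placeholder prefix
            enabled=True,
        ),
    ),
)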

 

1 REPLY

Yogesh_378691
Contributor

The endpoint itself is working since you're receiving responses, but the inference table isn't capturing any records, which points to a logging or compatibility issue rather than a problem with request handling. Please confirm that inference logging (auto capture) is enabled on the endpoint, verify permissions on the target catalog and schema, and make sure you're checking the correct table. If it still stays empty, it's likely a platform-side limitation with Gemma 3 12b, and raising a Databricks support ticket would be the next step.
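As a minimal sketch of that check (assuming the Databricks Python SDK and a hypothetical endpoint name), you can inspect the endpoint's auto capture config and query the payload table directly:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Inspect the endpoint's config; auto_capture_config must exist and be enabled
# for requests and responses to be logged to the inference table.
ep = w.serving_endpoints.get("gemma-3-12b-endpoint")  # hypothetical endpoint name
cfg = ep.config.auto_capture_config
print(cfg)

# When capture is enabled, payloads are written to <catalog>.<schema>.<prefix>_payload.
if cfg and cfg.enabled:
    payload_table = f"{cfg.catalog_name}.{cfg.schema_name}.{cfg.table_name_prefix}_payload"
    # Run this part in a Databricks notebook, where `spark` is available.
    print(spark.table(payload_table).count())

If the config shows enabled=True and the payload table still has zero rows long after your requests, that supports the platform-side explanation and is worth including in the support ticket.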

Yogesh Verma
