Hi, I have a problem regarding the inference table for Gemma 3 12b. When I create a serving endpoint for the model (from system.ai.gemma-3-12b-it) with an inference table, the inference table is created, but it always stays empty, no matter how many requests I make or where I make them from (API, Playground, Query endpoint window). Each of those requests does return an actual response.
I'm aware that entries in the inference table can show up with some delay, but even after hours nothing appears. I have also tried deleting and re-creating the endpoint and the inference table multiple times, with no success.
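For completeness, this is roughly how I check whether anything has landed in the table from a notebook; the catalog, schema, and table names below are placeholders, not my real ones:

```python
# Minimal check from a Databricks notebook.
# "main.default.gemma_endpoint_payload" is a placeholder for my actual
# inference (payload) table name.
payload_table = "main.default.gemma_endpoint_payload"

df = spark.table(payload_table)
print(f"Rows captured so far: {df.count()}")

# Show the most recent captured requests, if any exist
df.orderBy(df.timestamp_ms.desc()) \
  .select("timestamp_ms", "status_code", "request", "response") \
  .show(5, truncate=80)
```

The count is always 0 and the table stays completely empty.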
These are my settings when creating the endpoint:

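In SDK terms, the setup should be roughly equivalent to the sketch below (Databricks Python SDK); the endpoint name, entity version, throughput values, and the catalog/schema/prefix for auto-capture are placeholders, not my exact values:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import (
    AutoCaptureConfigInput,
    EndpointCoreConfigInput,
    ServedEntityInput,
)

w = WorkspaceClient()

# All names and values below are placeholders, not my actual settings.
w.serving_endpoints.create(
    name="gemma-3-12b-it-endpoint",
    config=EndpointCoreConfigInput(
        served_entities=[
            ServedEntityInput(
                entity_name="system.ai.gemma-3-12b-it",
                entity_version="1",               # placeholder version
                scale_to_zero_enabled=True,
                min_provisioned_throughput=0,     # placeholder
                max_provisioned_throughput=9500,  # placeholder
            )
        ],
        # This part enables the inference table (auto capture)
        auto_capture_config=AutoCaptureConfigInput(
            catalog_name="main",
            schema_name="default",
            table_name_prefix="gemma_endpoint",
            enabled=True,
        ),
    ),
)
```

With a config like this, the corresponding `<prefix>_payload` table does get created in the given catalog and schema, but it never receives any rows.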