Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

AI_QUERY does not accept modelParameters argument

pemidexx
New Contributor II
I am trying to pass a column of data from Python/pandas to Spark and then run AI_QUERY. However, when I attempt to pass modelParameters (such as temperature), the function fails. Below is a minimal example:
 
import pandas as pd

queries = pd.DataFrame([
    {"request": """{"messages": [{"role": "system", "content": "You are a helpful AI assistant."}, {"role": "user", "content": "Write a short haiku about coffee."}]}"""}
])

# Convert the pandas DataFrame to a Spark DataFrame
queries_spark = spark.createDataFrame(queries)

# Create or replace a temporary view
queries_spark.createOrReplaceTempView("queries_view")

model = 'openai-gpt-4o-mini'
temp = 0.2

# Execute the SQL query
spark.sql(f"""
    CREATE OR REPLACE TEMP VIEW responses_view AS
    SELECT
        request,
        AI_QUERY(
            endpoint => '{model}',
            request => request,
            returnType => 'STRING',
            modelParameters => named_struct(
                'temperature', {temp}
            )
        ) AS response
    FROM queries_view
""")

# Load the data back into Python
responses_df = spark.table("responses_view")
display(responses_df)

This code results in the following error:

[UNRECOGNIZED_PARAMETER_NAME] Cannot invoke function `ai_query` because the function call included a named argument reference for the argument named `modelParameters`, but this function does not include any signature containing an argument with this name. Did you mean one of the following? [`returnType` `endpoint` `request`]. SQLSTATE: 4274K
File <command-384165507682497>, line 32
     16 spark.sql(f"""
     17     CREATE OR REPLACE TEMP VIEW responses_view AS
     18     SELECT
    (...)
     28     FROM queries_view
     29 """)
     31 # Load the data back into Python
---> 32 responses_df = spark.table("responses_view")
     33 display(responses_df)
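For what it's worth, the error text says ai_query has no signature with a modelParameters argument, so a quick check (just a sketch; the config key below is an assumption about the standard cluster usage tags and may not be set on every cluster type) is which Databricks Runtime the cluster is running, in case it predates modelParameters support:

# Quick check: which Databricks Runtime is this cluster on?
# The config key is an assumption; it may not be present on all cluster types.
print(spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion", "unknown"))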
2 REPLIES

Walter_C
Databricks Employee

Can you confirm whether your modelParameters follow the requirements mentioned here:

  • modelParameters (optional): A struct field that contains chat, completion, and embedding model parameters for serving foundation models or external models. These model parameters must be constant and not data dependent. When they are not specified or set to null, the default value is used. With the exception of temperature, which has a default value of 0.0, the default values for these model parameters are the same as those listed in the Foundation Model REST API reference.
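For reference, a minimal sketch of the documented call shape, with modelParameters built from literals only (nothing data dependent). The endpoint name is a placeholder, and this assumes a runtime whose ai_query signature actually includes modelParameters:

# Sketch only: constant modelParameters struct, placeholder endpoint name.
spark.sql("""
    SELECT
        AI_QUERY(
            endpoint        => 'openai-gpt-4o-mini',
            request         => 'Write a short haiku about coffee.',
            returnType      => 'STRING',
            modelParameters => named_struct('temperature', 0.2, 'max_tokens', 100)
        ) AS response
""").show(truncate=False)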

pemidexx
New Contributor II

Hi @Walter_C, yes, I am receiving this error when attempting to set only temperature, which should be supported on most, if not all, models, including the specific models I'm working with. The error message seems to indicate this is a problem with AI_QUERY itself, not the downstream model.
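In the meantime, a workaround I'm considering (a sketch, assuming the endpoint accepts temperature in the request payload as the Foundation Model chat API documents, and that MLflow >= 2.9 is installed) is to call the serving endpoint directly through the MLflow Deployments client, since temperature can go in the payload there instead of in AI_QUERY's modelParameters:

# Workaround sketch: set temperature in the request payload itself.
from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")

response = client.predict(
    endpoint="openai-gpt-4o-mini",  # placeholder endpoint name
    inputs={
        "messages": [
            {"role": "system", "content": "You are a helpful AI assistant."},
            {"role": "user", "content": "Write a short haiku about coffee."},
        ],
        "temperature": 0.2,
    },
)
print(response)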
