AI_QUERY does not accept modelParameters argument
11-22-2024 08:02 AM
import pandas as pd
queries = pd.DataFrame([
{"request": """{"messages": [{"role": "system", "content": "You are a helpful AI assistant."}, {"role": "user", "content": "Write a short haiku about coffee."}]}"""}
])
# Convert Pandas DataFrame to Spark DataFrame
queries_spark = spark.createDataFrame(queries)
# Create or replace a temporary view
queries_spark.createOrReplaceTempView("queries_view")
model = 'openai-gpt-4o-mini'
temp = 0.2
# Execute the SQL query
spark.sql(f"""
    CREATE OR REPLACE TEMP VIEW responses_view AS
    SELECT
      request,
      AI_QUERY(
        endpoint => '{model}',
        request => request,
        returnType => 'STRING',
        modelParameters => named_struct(
          'temperature', {temp}
        )
      ) AS response
    FROM queries_view
""")
# Load the data back into Python
responses_df = spark.table("responses_view")
display(responses_df)
This code results in the following error:
[UNRECOGNIZED_PARAMETER_NAME] Cannot invoke function `ai_query` because the function call included a named argument reference for the argument named `modelParameters`, but this function does not include any signature containing an argument with this name. Did you mean one of the following? [`returnType` `endpoint` `request`]. SQLSTATE: 4274K
File <command-384165507682497>, line 32
     16 spark.sql(f"""
     17 CREATE OR REPLACE TEMP VIEW responses_view AS
     18 SELECT
   (...)
     28 FROM queries_view
     29 """)
     31 # Load the data back into Python
---> 32 responses_df = spark.table("responses_view")
     33 display(responses_df)
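For reference, a minimal isolation sketch (assuming the model variable and queries_view defined above): running the same query without modelParameters confirms whether the base three-argument signature works, which would show the failure is specific to the extra named argument rather than to the endpoint or request.
# Sketch: same query without modelParameters, to isolate the failing argument
baseline_df = spark.sql(f"""
    SELECT
      request,
      AI_QUERY(
        endpoint => '{model}',
        request => request,
        returnType => 'STRING'
      ) AS response
    FROM queries_view
""")
display(baseline_df)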
- Labels: Spark
11-22-2024 11:21 AM
Can you confirm whether your modelParameters struct follows the requirements mentioned here:
- modelParameters (optional): A struct field that contains chat, completion, and embedding model parameters for serving foundation models or external models. These model parameters must be constant parameters and not data dependent. When these model parameters are not specified or set to null, the default value is used. With the exception of temperature, which has a default value of 0.0, the default values for these model parameters are the same as those listed in the Foundation model REST API reference.
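For illustration, a minimal sketch of a call that satisfies the constant-parameter requirement quoted above, with the struct written as plain SQL literals (endpoint and request taken from the post; max_tokens is an illustrative extra parameter, not something from this thread):
# Sketch: constant, non-data-dependent modelParameters per the quoted docs
display(spark.sql("""
    SELECT
      AI_QUERY(
        endpoint => 'openai-gpt-4o-mini',
        request => '{"messages": [{"role": "user", "content": "Write a short haiku about coffee."}]}',
        returnType => 'STRING',
        modelParameters => named_struct(
          'temperature', 0.2,  -- constant literal, not a column reference
          'max_tokens', 100    -- illustrative extra parameter
        )
      ) AS response
"""))
Note that the original code interpolates a Python constant through an f-string, which also produces a constant literal in the generated SQL, so the constant-parameter requirement appears to be met.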
11-22-2024 12:31 PM
Hi @Walter_C, yes, I am receiving this error when attempting to set only temperature, which should be supported on most if not all models, including the specific models I'm working with. The error message seems to indicate this is a problem with AI_QUERY itself, not the downstream model.
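If the workspace's AI_QUERY signature genuinely does not expose modelParameters yet, one possible workaround (an assumption, not something confirmed in this thread) is to call the serving endpoint directly from Python with the MLflow Deployments client, which lets temperature ride along in the request body; the endpoint name and prompt below are taken from the post.
# Workaround sketch: query the endpoint directly so temperature can be set,
# bypassing AI_QUERY entirely. Assumes the mlflow package is available on the cluster.
from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")
response = client.predict(
    endpoint="openai-gpt-4o-mini",
    inputs={
        "messages": [
            {"role": "system", "content": "You are a helpful AI assistant."},
            {"role": "user", "content": "Write a short haiku about coffee."},
        ],
        "temperature": 0.2,  # passed in the request body instead of modelParameters
    },
)
print(response)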

