01-28-2024 11:24 AM
Hi,
I have a PySpark dataframe and a PySpark UDF that calls an MLflow model for each row, but its performance is too slow.
Here is the sample code:
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

def myfunc(input_text):
    result = mlflowmodel.predict(input_text)
    return result

myfuncUDF = udf(myfunc, StringType())

df = spark.sql("select * from test")
df = df.withColumn("test_result", myfuncUDF("input_text"))
Please suggest how to improve the performance.
Regards,
Sanjay
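The usual fix for this row-at-a-time pattern is to batch the model calls so the per-call overhead is amortized over a whole column chunk. A minimal sketch, assuming the model's predict() accepts a DataFrame of inputs; ToyModel below is a hypothetical stand-in for the real MLflow model:

```python
import pandas as pd

# Batched prediction: one model call per batch instead of one per row.
# Assumption: the model's predict() takes a DataFrame and returns one
# value per row (as MLflow pyfunc models typically do).
def predict_batch(input_text: pd.Series, model) -> pd.Series:
    preds = model.predict(pd.DataFrame({"input_text": input_text}))
    return pd.Series(list(preds), index=input_text.index).astype(str)

# Toy stand-in model (NOT the real mlflowmodel) for demonstration:
class ToyModel:
    def predict(self, df):
        return df["input_text"].str.upper()

result = predict_batch(pd.Series(["a", "b"]), ToyModel())
# result holds ["A", "B"]

# On Spark, predict_batch can be wrapped in a pandas UDF (sketch):
#
# from pyspark.sql.functions import pandas_udf
# from pyspark.sql.types import StringType
#
# @pandas_udf(StringType())
# def myfunc_udf(s: pd.Series) -> pd.Series:
#     return predict_batch(s, mlflowmodel)
#
# df = df.withColumn("test_result", myfunc_udf("input_text"))
```

With a pandas UDF, Spark hands the function whole column batches (via Arrow) rather than one row at a time, which is where the speedup comes from.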
01-31-2024 04:28 AM
Thank you Kaniz for the suggestions. This is really helpful. I also tried using applyInPandas, though I'm not sure whether it is better than a Spark UDF. If not, can you help me convert this function to a pandas UDF or some other optimized function?
02-06-2024 11:36 PM
Thank you @Kaniz_Fatma, it's really helpful and it worked. Another quick question: I have to pass 2 parameters as input to myfunc. Please help me understand how to pass multiple parameters.
import pandas as pd
from pyspark.sql.functions import pandas_udf
from pyspark.sql.types import StringType

def myfunc(input_text, param2):
    # Assuming mlflowmodel is defined elsewhere
    result = mlflowmodel.predict(input_text, param2)
    return result

# Create a Pandas UDF
@pandas_udf(StringType())
def myfunc_udf(input_text_series: pd.Series, param2_series: pd.Series) -> pd.Series:
    # Series.apply passes only one value per call, so combine the two
    # columns element-wise instead:
    return pd.Series(
        [myfunc(t, p) for t, p in zip(input_text_series, param2_series)]
    )
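One way to fill in the UDF body is to combine the two Series element-wise with zip, since Series.apply passes only a single value per call. A self-contained sketch, with a hypothetical predict_one standing in for mlflowmodel.predict:

```python
import pandas as pd

# Hypothetical stand-in for mlflowmodel.predict on one row:
def predict_one(text, param2):
    return f"{text}:{param2}"

def myfunc(input_text, param2):
    return predict_one(input_text, param2)

# The body of a two-column pandas UDF: pair up the Series row by row
# and call myfunc once per pair.
def udf_body(input_text_series: pd.Series, param2_series: pd.Series) -> pd.Series:
    return pd.Series(
        [myfunc(t, p) for t, p in zip(input_text_series, param2_series)]
    )

out = udf_body(pd.Series(["a", "b"]), pd.Series([1, 2]))
# out holds ["a:1", "b:2"]
```

On Spark this body would be decorated with @pandas_udf(StringType()) and invoked as myfunc_udf("input_text", "param2"), passing one column name per parameter.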
02-07-2024 05:05 AM
Hi Kaniz,
I started getting the following error after using myfunc_udf with 2 parameters:
PythonException: ValueError: The truth value of a Series is ambiguous. Use a.empty, a.bool(), a.item(), a.any() or a.all()
Regards,
Sanjay
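That ValueError typically appears when a whole pandas Series reaches a scalar boolean check, e.g. an `if input_text:` somewhere inside the function the UDF calls. A minimal reproduction of the error, so it can be recognized and fixed:

```python
import pandas as pd

s = pd.Series(["a", "b"])

# A pandas UDF receives whole Series, not scalars; any scalar-style
# truth test on a Series raises the ambiguity error from the traceback:
msg = ""
try:
    if s:  # ambiguous: should this mean any()? all()?
        pass
except ValueError as e:
    msg = str(e)

# Fix: operate element-wise (zip or apply per row), or be explicit
# with s.any() / s.all() when a whole-Series condition is intended.
```

In this thread, the likely culprit is code inside myfunc (or the model's predict) treating one of the Series arguments as a single value instead of iterating over it.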
02-07-2024 01:40 AM
I have another brief question: I need to send two arguments to myfunc. Could you give me some guidance on how to pass in multiple parameters?
02-07-2024 06:12 AM
Hello Sanjay,
Could you please share your code snippet with the latest changes?
02-07-2024 07:26 AM - edited 02-07-2024 07:39 AM
PythonException: ValueError: The truth value of a Series is ambiguous. Use a.empty, a.bool(), a.item(), a.any() or a.all()
02-07-2024 09:25 PM
Hello Sanjay,
The above code doesn't have df defined. Can you share your df.show() output?
03-18-2024 07:19 PM
So good