Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to ensure pyspark udf execution is distributed across worker nodes

pjv
New Contributor III

Hi,

I have the following databricks notebook code defined:

 
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

pyspark_dataframe = create_pyspark_dataframe(some_input_data)
MyUDF = udf(myfunc, StringType())
pyspark_dataframe = pyspark_dataframe.withColumn('UDFOutput', MyUDF(input_data_columns))
output_strings = [x["UDFOutput"] for x in pyspark_dataframe.select("UDFOutput").collect()]
 
I'm running this notebook on a cluster with multiple worker nodes. How can I ensure that the UDF execution is distributed evenly across the worker nodes?
 
Kind regards,
 
Pim
1 REPLY

VZLA
Databricks Employee

@pjv Can you please try the following? You'll basically want to have more than a single partition:

from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

# Initialize Spark session (if not already done)
spark = SparkSession.builder.appName("AppName").getOrCreate()

# Create a PySpark DataFrame from your input data
pyspark_dataframe = create_pyspark_dataframe(some_input_data)

# Repartition the DataFrame to ensure even distribution across worker nodes
num_partitions = 4  # Adjust based on your cluster size
pyspark_dataframe = pyspark_dataframe.repartition(num_partitions)

# Define your UDF
MyUDF = udf(myfunc, StringType())

# Apply the UDF to the DataFrame
pyspark_dataframe = pyspark_dataframe.withColumn('UDFOutput', MyUDF(input_data_columns))

# Collect the results
output_strings = [x["UDFOutput"] for x in pyspark_dataframe.select("UDFOutput").collect()]

# Check the Spark UI (Stages tab) to confirm the UDF ran as multiple tasks across executors.
