
Job fails after runtime upgrade

NicolasEscobar
New Contributor II

I have a job that runs with no issues on Databricks Runtime 7.3 LTS. After upgrading to 8.3 it fails with the error: An exception was thrown from a UDF: 'pyspark.serializers.SerializationError'... SparkContext should only be created and accessed on the driver

In the notebook I use applyInPandas to apply a UDF to each group. Inside this UDF I pull data from Snowflake using the Spark session (spark.read.format(...)), and I understand that is why it fails. A sketch of the pattern is below.
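
Roughly like this (simplified, with placeholder names for the table, connection options, join key, and schema):

import pandas as pd

def enrich_group(pdf: pd.DataFrame) -> pd.DataFrame:
    # Runs on an executor. Referencing `spark` here captures the driver's
    # SparkSession/SparkContext in the closure, which Spark 3.1+ rejects.
    lookup = (
        spark.read.format("snowflake")
        .options(**sf_options)        # placeholder connection options
        .option("dbtable", "LOOKUP")  # placeholder table name
        .load()
        .toPandas()
    )
    return pdf.merge(lookup, on="key")  # placeholder join key

result = df.groupBy("group_col").applyInPandas(enrich_group, schema=result_schema)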

My question is: why did this work on 7.3 LTS but not now? What changed?

Thanks,


8 REPLIES

Kaniz
Community Manager

Hi @NicolasEscobar! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question first; if not, I will follow up with my team and get back to you soon. Thanks.

shan_chandra
Honored Contributor III

@Nicolas Escobar - could you please share the full error stack trace?

User16763506586
Contributor
(Accepted Solution)

DBR 8.3 uses Spark 3.1.x. Per the migration guide, creating a SparkContext inside an executor is now restricted by default. You can re-enable the old behavior with spark.executor.allowSparkContext.

In Spark 3.0 and below, SparkContext can be created in executors. Since Spark 3.1, an exception will be thrown when creating SparkContext in executors. You can allow it by setting the configuration spark.executor.allowSparkContext when creating SparkContext in executors.
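
A minimal sketch of setting that flag, assuming you want the old behavior back as a stopgap rather than a real fix. It has to be in place when the context is created, so on Databricks it typically goes in the cluster's Spark config (spark.executor.allowSparkContext true) rather than in notebook code; in plain PySpark it can be passed when building the session:

from pyspark.sql import SparkSession

# Escape hatch only: re-allows creating/accessing SparkContext on
# executors (Spark 3.1+). It does not make the underlying pattern safe.
spark = (
    SparkSession.builder
    .config("spark.executor.allowSparkContext", "true")
    .getOrCreate()
)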

sean_owen
Honored Contributor II

To clarify a bit more: in Spark, you can never use a SparkContext or SparkSession within a task or UDF. This has always been true. If it worked before, it's because the SparkContext was accidentally captured in your code and shipped with the UDF, but you presumably never tried to use it there; that would have failed. Now it just fails earlier.

The real solution is to change your code so that it does not accidentally hold on to the SparkContext or SparkSession in your UDF.

NicolasEscobar
New Contributor II

Thanks Sean for your answer, it's clear.

I was just wondering why the code executed before with no errors and the expected output. Now I understand this is because there was no such restriction before Spark 3.1, as Sandeep mentioned.

Hi @Sean Owen, thanks for highlighting this. Could you please provide some sample code for what you mean by "not accidentally hold on to the SparkContext or SparkSession in your UDF"? Thanks.

sean_owen
Honored Contributor II

There are a thousand ways this could happen, so not really, but they all come down to the same idea: you can't reference the SparkContext or SparkSession object, directly or indirectly, in a UDF. Simply put, you cannot use it in the UDF code.
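
For the specific pattern in this thread, one common refactor (sketched here with placeholder names, not the actual job code) is to do the Snowflake read on the driver, where the session is allowed, and let the UDF close over plain pandas data instead:

import pandas as pd

# Driver side: using the Spark session here is fine.
lookup_pdf = (
    spark.read.format("snowflake")
    .options(**sf_options)        # placeholder connection options
    .option("dbtable", "LOOKUP")  # placeholder table name
    .load()
    .toPandas()                   # plain pandas data, safe to ship to executors
)

def enrich_group(pdf: pd.DataFrame) -> pd.DataFrame:
    # Only lookup_pdf (plain data) is captured; no SparkSession/SparkContext.
    return pdf.merge(lookup_pdf, on="key")  # placeholder join key

result = df.groupBy("group_col").applyInPandas(enrich_group, schema=result_schema)

If the lookup data is too large to collect to the driver, a regular Spark-side join before the grouped UDF avoids capturing it in the closure at all.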

User16873042682
New Contributor II

Adding to @Sean Owen's comments: the only reason this was working before is that the optimizer was evaluating it locally rather than creating a context on the executors.
