Data Engineering

ModuleNotFoundError: No module named 'pulp'

YS1
New Contributor III

Hello,

I'm encountering an issue while running a notebook that utilizes the Pulp library. The library is installed in the first cell of the notebook. Occasionally, I encounter the following error:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 92 in stage 51.0 failed 4 times, most recent failure: Lost task 92.3 in stage 51.0 (TID 4465) (10.153.242.115 executor 4): org.apache.spark.SparkException: Task failed while writing rows.

During handling of the above exception, another exception occurred:

pyspark.serializers.SerializationError: Caused by Traceback (most recent call last):
  File "/databricks/spark/python/pyspark/serializers.py", line 188, in _read_with_length
    return self.loads(obj)
  File "/databricks/spark/python/pyspark/serializers.py", line 540, in loads
    return cloudpickle.loads(obj, encoding=encoding)
ModuleNotFoundError: No module named 'pulp'

What's puzzling is that rerunning the code often succeeds. Could anyone provide insight into why this intermittent issue might be occurring?

Thanks.


Kaniz
Community Manager

Hi @YS1, ensure that the Pulp library is correctly installed in the notebook environment.

You can verify this by running the following command:

!pip show pulp

If Pulp is not installed, you can install it using:

!pip install pulp
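As a first diagnostic step, you can also check programmatically whether the current interpreter can resolve the module without actually importing it. This is a minimal sketch (the `module_available` helper is hypothetical, not part of any Databricks API):

```python
import importlib.util

def module_available(name: str) -> bool:
    """Hypothetical helper: True if `name` is importable in this Python environment."""
    return importlib.util.find_spec(name) is not None

# On the driver this may report True while executors still lack the module,
# since each Spark worker runs its own Python interpreter.
print(module_available("pulp"))
```

Running this in a plain notebook cell only tells you about the driver; the intermittent error in the traceback above was raised on an executor.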

YS1
New Contributor III

I've double-checked, and the Pulp library is correctly installed. However, I'm still encountering the intermittent `ModuleNotFoundError: No module named 'pulp'` error, which is perplexing.
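One detail worth checking (this is general Databricks behavior, not something confirmed in this thread): a shell-style `!pip install` runs only on the driver node, whereas the `%pip` magic installs the package into the environments of the driver and all executors. If the install cell uses `!pip`, tasks that deserialize PuLP code on an executor can fail with exactly this `ModuleNotFoundError`, and a rerun can succeed whenever the affected tasks happen to land on the driver-adjacent path or a retried executor. A sketch of the install cell using the notebook magic instead:

```
%pip install pulp
```

Alternatively, attaching the library to the cluster itself (cluster Libraries tab, install from PyPI) makes it available on every node regardless of notebook cell order.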
