Hello,
I'm running into an issue with a notebook that uses the PuLP library. The library is installed in the first cell of the notebook. Occasionally, I hit the following error:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 92 in stage 51.0 failed 4 times,
most recent failure: Lost task 92.3 in stage 51.0 (TID 4465) (10.153.242.115 executor 4):
org.apache.spark.SparkException: Task failed while writing rows.

During handling of the above exception, another exception occurred:

pyspark.serializers.SerializationError: Caused by Traceback (most recent call last):
  File "/databricks/spark/python/pyspark/serializers.py", line 188, in _read_with_length
    return self.loads(obj)
  File "/databricks/spark/python/pyspark/serializers.py", line 540, in loads
    return cloudpickle.loads(obj, encoding=encoding)
ModuleNotFoundError: No module named 'pulp'
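For context, the setup looks roughly like this. It is a simplified sketch of the pattern, not my exact code: solve_row, the toy model, and the output path are placeholders.

# Cell 1: notebook-scoped install
%pip install pulp

# Cell 2 (simplified): a UDF whose pickled closure references pulp,
# so every executor that unpickles and runs it needs pulp installed
import pulp
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType

@F.udf(returnType=DoubleType())
def solve_row(capacity):
    prob = pulp.LpProblem("toy", pulp.LpMaximize)
    x = pulp.LpVariable("x", lowBound=0)
    prob += x              # objective: maximize x
    prob += x <= capacity  # single constraint
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return float(pulp.value(x))

df = spark.range(100).withColumn("capacity", F.col("id").cast("double"))
df.withColumn("best", solve_row("capacity")) \
  .write.mode("overwrite").parquet("/tmp/pulp_demo")  # placeholder output path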
What's puzzling is that rerunning the code often succeeds. Could anyone provide insight into why this intermittent issue might be occurring?
Thanks.