Problem with sparkContext.parallelize and volatile functions?
I have this code:

```python
from time import sleep
from random import random
from operator import add

def f(a: int) -> float:
    sleep(0.1)
    return random()

rdd1 = sc.parallelize(range(20), 2)
rdd2 = sc.parallelize(range(20), 2)
rdd3 = sc.parallelize(rang...
```
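Since the post is truncated, here is a minimal plain-Python sketch (no Spark required, names are illustrative) of what the question is likely about: a non-deterministic ("volatile") function like `random()` mapped over a lazily evaluated dataset is re-executed on every materialization, so an uncached RDD can yield different values on each action.

```python
from random import random

def f(a: int) -> float:
    # Non-deterministic: returns a fresh value on every call,
    # regardless of the input element.
    return random()

data = range(20)

# Each list comprehension plays the role of one "action" on an
# uncached RDD: the pipeline is re-run from scratch both times.
first = [f(x) for x in data]
second = [f(x) for x in data]

# Because f is non-deterministic, the two materializations disagree.
print(first == second)
```

In Spark terms, `rdd1.map(f).collect()` called twice (or two RDDs built the same way) would show the same behavior unless the result is pinned with `cache()`/`persist()` before the first action.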