by
gnosis
• New Contributor II
- 1637 Views
- 0 replies
- 1 kudos
When the notebook is run by the jobs/workflow scheduler, the data is never imported, but the files do get removed. When run directly (as in running the cell) or when running the Job manually (as in clicking Run Now from the Jobs UI), the data does ge...
- 2850 Views
- 4 replies
- 2 kudos
import matplotlib.pyplot as plt
import seaborn as sns
import pandas as pd
import numpy as np

prob = np.random.rand(7) + 0.1
prob /= prob.sum()
df = pd.DataFrame({'department': np.random.choice(['helium', 'neon', 'argon', 'krypton', 'xenon', 'radon', 'ogane...
Latest Reply
@Maria Bruevich - Do either of these answers help? If yes, would you be happy to mark one as best so that other members can find the solution more quickly?
- 39103 Views
- 6 replies
- 0 kudos
No other output is available, not even output from cells that did run successfully.
Also, I'm unable to connect to spark ui or view the logs. It makes an attempt to load each of them, but after some time an error message appears saying it's unable ...
Latest Reply
Most of the time this is an out-of-memory error on the driver node. Check the driver log and the worker node logs in the Spark UI.
Also check whether you are pulling a large dataset onto the driver node, e.g. with collect().
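The reply's point is that collect() materializes every row on the driver, while an aggregation reduces data on the executors first. A minimal plain-Python analogy (not PySpark; rows() is a hypothetical stand-in for a large distributed dataset) of why reducing before collecting matters:

```python
def rows(n):
    """Simulates a large distributed dataset as a lazy stream of rows."""
    for i in range(n):
        yield i

# Anti-pattern analogue of df.collect(): pull every row into driver memory.
all_rows = list(rows(1_000_000))       # O(n) memory held at once
total_collected = sum(all_rows)

# Preferred analogue of aggregating before collecting: reduce while streaming.
total_streamed = sum(rows(1_000_000))  # O(1) memory held at once

assert total_collected == total_streamed  # same answer, far less driver memory
```

In real PySpark the same idea means preferring built-in aggregations, limit(), or writing results to storage over calling collect() on a large DataFrame.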