Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

gnosis
by New Contributor II
  • 1419 Views
  • 1 reply
  • 1 kudos

Can anyone tell me why this notebook would fail when run as a Job?

When the notebook is run by the jobs/workflow scheduler, the data is never imported, but the files do get removed. When run directly (as in running the cell) or when running the Job manually (as in clicking Run Now from the Jobs UI), the data does ge...
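
One way to narrow down this kind of interactive-vs-scheduled discrepancy (a hypothetical diagnostic sketch; the path and table name are placeholders, not from the post) is to log what the job run actually sees and delete files only after the write succeeds:

files = dbutils.fs.ls('/mnt/landing/')  # placeholder source path
print(f'found {len(files)} files: {[f.path for f in files]}')

df = spark.read.json([f.path for f in files])
print(f'read {df.count()} rows')  # confirm the import actually happened

df.write.mode('append').saveAsTable('raw_events')  # placeholder table name

for f in files:  # remove only after a successful write
    dbutils.fs.rm(f.path)

Comparing this output between a manual run and a scheduled run shows whether the job cluster sees the same files at all.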

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @Justin Denick, what DBR (Databricks Runtime) version are you using?

bluetail
by Contributor
  • 2211 Views
  • 4 replies
  • 2 kudos

Resolved! Value Labels fail to display in Databricks notebook but they are displayed ok in Jupyter

import matplotlib.pyplot as plt
import seaborn as sns
import pandas as pd
import numpy as np

prob = np.random.rand(7) + 0.1
prob /= prob.sum()
df = pd.DataFrame({'department': np.random.choice(['helium', 'neon', 'argon', 'krypton', 'xenon', 'radon', 'ogane...
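
Since the snippet is cut off, here is a minimal, runnable sketch of the kind of labeled bar plot being described; the department list and the bar_label call are reconstructions for illustration (ax.bar_label requires matplotlib >= 3.4), not the poster's exact code:

import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import seaborn as sns

rng = np.random.default_rng(0)
prob = rng.random(7) + 0.1
prob /= prob.sum()  # normalize so the values sum to 1

df = pd.DataFrame({
    'department': ['helium', 'neon', 'argon', 'krypton', 'xenon', 'radon', 'oganesson'],
    'prob': prob,
})

ax = sns.barplot(data=df, x='department', y='prob')
ax.bar_label(ax.containers[0], fmt='%.2f')  # draw a value label above each bar
plt.show()  # in a Databricks notebook an explicit show() (or display(fig)) can matter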

Latest Reply
Anonymous
Not applicable
  • 2 kudos

@Maria Bruevich - Do either of these answers help? If yes, would you be happy to mark one as best so that other members can find the solution more quickly?

3 More Replies
JustinMills
by New Contributor III
  • 32842 Views
  • 6 replies
  • 0 kudos

Resolved! Job fails with "The spark driver has stopped unexpectedly and is restarting. Your notebook will be automatically reattached."

No other output is available, not even output from cells that did run successfully. Also, I'm unable to connect to the Spark UI or view the logs. It makes an attempt to load each of them, but after some time an error message appears saying it's unable ...

Latest Reply
lzlkni
New Contributor II
  • 0 kudos

Most of the time this is an out-of-memory error on the driver node. Check the driver log and the worker logs in the Spark UI, and check whether you are collecting a huge amount of data to the driver node, e.g. with collect().
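
A hedged sketch of the pattern this reply warns about, with alternatives that keep the work on the executors (the dataset size and output path are placeholders, not from the thread):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.range(100_000_000)  # stand-in for a large distributed dataset

# Risky: collect() materializes every row in driver memory and is a common
# cause of "the spark driver has stopped unexpectedly".
# rows = df.collect()

# Safer alternatives:
df.show(20)                        # print a few rows without collecting everything
sample = df.limit(1000).collect()  # bound how much data reaches the driver
df.write.mode('overwrite').parquet('/tmp/large_output')  # keep the work distributed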

5 More Replies