Following. Having a similar issue in which setting num_workers to 0 doesn't work. When I deploy the bundle:

```
Error: cannot update job: NumWorkers could be 0 only for SingleNode clusters.
```

```yaml
- job_cluster_key: ${bundle.name}
  new_cluster:
    cluster_name: ""
    spark...
```
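If it's useful to anyone: the API only accepts num_workers: 0 when the cluster is explicitly declared single-node, which requires both a spark_conf profile and a custom tag. A minimal sketch of the job cluster definition, where the spark_version and node_type_id values are placeholders to swap for ones valid in your workspace:

```yaml
# Sketch: declaring a single-node job cluster in a Databricks asset bundle.
# spark_version and node_type_id are placeholder values, not from the post above.
- job_cluster_key: ${bundle.name}
  new_cluster:
    spark_version: 13.3.x-scala2.12
    node_type_id: i3.xlarge
    num_workers: 0
    spark_conf:
      # Both settings mark the cluster as single-node so num_workers: 0 is accepted.
      spark.databricks.cluster.profile: singleNode
      spark.master: "local[*]"
    custom_tags:
      ResourceClass: SingleNode
```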
I ran into a similar error just now, and in my case PyCharm was running some IPython startup scripts each time it opened a console. There was, for some reason, a file at `~/.ipython/profile_default/startup/00-databricks-init-a5acf3baa440a896fa364d18...`
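For anyone debugging the same thing: IPython runs every script in the profile's startup directory each time a new console opens, so listing that directory shows what's being executed behind the scenes. A quick sketch, assuming the default profile location:

```python
from pathlib import Path

# IPython executes every script in this directory, in lexicographic order,
# whenever a new console starts (default profile shown).
startup_dir = Path.home() / ".ipython" / "profile_default" / "startup"

for script in sorted(startup_dir.glob("*.py")):
    print(script.name)  # inspect these; delete or move any you don't recognize
```

Anything unexpected in there can be deleted or moved elsewhere, and the console restarted.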
Things seem to be mostly working for me now. I've added a bit more detail on my connection steps and process on Stack Overflow, in case it's helpful for anyone: https://stackoverflow.com/questions/76407426/connecting-rstudio-desktop-to-databricks-comm...