03-16-2022 04:51 AM
Hi,
We have two workspaces on Databricks, prod and dev. On prod, if we create a new all-purpose cluster through the web interface and go to Environment in the Spark UI, the spark.master setting is correctly set to the host IP, so the cluster runs in standalone mode.
However, on the very similar dev workspace, if we create a new all-purpose cluster in exactly the same way, spark.master is set to local[*], which means the cluster runs in local mode and does not make use of the executors at all! As far as we can tell, no settings are being overridden or defined differently during cluster creation!
Is there some Spark configuration at the workspace or account level that we need to change so that a new all-purpose cluster does not default to local mode?
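For future readers: a quick way to see which mode a cluster is actually running in is to print the master URL from a notebook (`spark.sparkContext.master` on Databricks). The tiny helper below just illustrates the two URL shapes being discussed; the function name and the example host IP are mine, not part of any Databricks API:

```python
def is_local_master(master: str) -> bool:
    """True if the master URL means driver-only local mode."""
    # "local", "local[4]", and "local[*]" all run Spark in local mode;
    # a standalone cluster master looks like "spark://<host-ip>:7077".
    return master == "local" or master.startswith("local[")

print(is_local_master("local[*]"))               # True  (the dev-workspace symptom)
print(is_local_master("spark://10.0.0.5:7077"))  # False (a healthy standalone cluster)
```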
Thanks in advance!
04-01-2022 06:36 AM
Hey guys, thanks for getting back to me and sorry for the delay.
We managed to find the issue thanks to Databricks Support! There was a bash script that someone had uploaded directly into dbfs:/databricks/init that was switching the cluster into local mode after startup 🙈
Once we deleted it, everything was fine again! Warning to future readers! 😀
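For anyone hitting the same symptom: scripts in dbfs:/databricks/init are legacy global init scripts that run on every cluster in the workspace, so a stray one there can silently override spark.master everywhere. If you have the Databricks CLI configured against the workspace, you can audit that folder like this (the script name below is hypothetical, for illustration only):

```shell
# List legacy global init scripts -- these run on every cluster at startup.
databricks fs ls dbfs:/databricks/init/

# Inspect a suspicious script before removing it (hypothetical file name):
databricks fs cat dbfs:/databricks/init/set-local-mode.sh

# Delete it once you've confirmed it is the culprit:
databricks fs rm dbfs:/databricks/init/set-local-mode.sh
```

Note that clusters only pick up the change after a restart, since init scripts run at cluster startup.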
03-16-2022 08:59 AM
@Riaan Swart
Thank you for posting to the Databricks community.
Could you please share a screenshot of the Cluster Configuration and Spark cluster UI - Master page?
04-01-2022 05:27 AM
Hi @Riaan Swart , Would you like to share the screenshot?
04-01-2022 07:02 AM
Cool!!
It's great that you came back and shared the solution with us.
Thank you @Riaan Swart !!
12-26-2022 07:05 PM
I hit the same issue after choosing the default cluster setup on first setup: when I went to edit the cluster to add an instance profile, I could not save until I fixed this. Thanks for the tip