Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

New spark cluster being configured in local mode

supremefist
New Contributor III

Hi,

We have two workspaces on Databricks, prod and dev. On prod, if we create a new all-purpose cluster through the web interface and go to Environment in the Spark UI, the spark.master setting is correctly set to the host IP. This results in a cluster that runs in standalone mode.

However, we have a very similar workspace, dev, where if we create a new all-purpose cluster in exactly the same way, spark.master is set to local[*], which means the cluster runs in local mode and does not make use of executors at all! As far as we can tell, no settings are being overridden or defined differently during cluster creation.

Is there some spark configuration somewhere on the workspace or account level that we need to change in order for a new all-purpose cluster not to default to local mode?
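For readers hitting the same symptom: in a notebook you can read the effective master with spark.conf.get("spark.master"). As an illustrative sketch (the helper below is not part of Databricks or Spark, just an assumption for demonstration), Spark master URLs of the form local, local[*], local[4], or local[4,2] all mean local mode, while standalone clusters use spark://host:port:

```python
def is_local_master(master: str) -> bool:
    """Return True if a Spark master URL would run everything in local mode."""
    # "local", "local[*]", "local[4]", "local[4,2]" are all local-mode masters;
    # a standalone cluster master looks like "spark://10.0.0.5:7077"
    return master == "local" or master.startswith("local[")

print(is_local_master("local[*]"))               # True: local mode, no executors
print(is_local_master("spark://10.0.0.5:7077"))  # False: standalone cluster
```

In a Databricks notebook you would pass spark.conf.get("spark.master") into such a check.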

Thanks in advance!

1 ACCEPTED SOLUTION


supremefist
New Contributor III

Hey guys, thanks for getting back to me and sorry for the delay.

We managed to find the issue thanks to Databricks support! There was a bash script that someone had uploaded directly into dbfs:/databricks/init that was setting the cluster into local mode after startup 🙈

Once we deleted it, everything was fine again! Warning to future readers! 😀
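For anyone auditing a workspace for the same problem: you can list legacy global init scripts from a notebook with dbutils.fs.ls("dbfs:/databricks/init"). As a rough sketch of what to look for once you pull a script's contents down (the helper name and the sample script below are hypothetical, not actual Databricks tooling or the script from this thread):

```python
def find_local_mode_lines(script_text: str) -> list[str]:
    """Flag non-comment lines in an init script that appear to force a local master."""
    suspicious = []
    for line in script_text.splitlines():
        stripped = line.strip()
        if stripped.startswith("#"):
            continue  # ignore comments
        if "local[" in stripped or "--master local" in stripped:
            suspicious.append(stripped)
    return suspicious

# Hypothetical init-script contents for illustration only
sample_script = """#!/bin/bash
# legacy global init script
echo "spark.master local[*]" >> "$SPARK_CONF_DIR/spark-defaults.conf"
"""
print(find_local_mode_lines(sample_script))
```

This is only a string scan; the real fix, as above, was simply deleting the offending script from dbfs:/databricks/init and restarting the cluster.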

View solution in original post

3 REPLIES

User16764241763
Honored Contributor

@Riaan Swart

Thank you for posting to the Databricks community.

Could you please share a screenshot of the Cluster Configuration and Spark cluster UI - Master page?


scottb
New Contributor II

I hit the same issue after choosing the default cluster setup on first setup: when I went to edit the cluster to add an instance profile, I was not able to save without fixing this. Thanks for the tip!
