Data Engineering
How can I view and change the SparkConf settings if the SparkContext (sc) is already provided for me?

cfregly
Contributor
5 REPLIES

cfregly
Contributor

From the Clusters tab, select a cluster and view the Spark UI.

The Environment tab shows the current Spark configuration settings.

Here is an exhaustive list of the Spark Config params: https://spark.apache.org/docs/latest/configuration.html

The SparkContext is provided for you within the notebook UI, so you cannot change these values from your notebook code. Once a SparkConf is passed to the SparkContext constructor, its values are cloned and can no longer be changed. This is a Spark limitation.

One thing to note is that Databricks has already tuned Spark for the most common workloads running on the specific EC2 instance types used within Databricks Cloud.

In other words, you shouldn't have to change these default values except in extreme cases. To change these defaults, please contact Databricks Cloud support.

If you're working with the SQLContext or HiveContext, you can manually set `spark.sql.*` configuration properties using HiveQL's `SET key=value` command, with properties from this list: https://spark.apache.org/docs/latest/sql-programming-guide.html#configuration

However, overriding these configuration values may cause problems for other users of the cluster.
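As a sketch of that approach, assuming `sqlContext` is the SQLContext/HiveContext provided in the notebook and using `spark.sql.shuffle.partitions` as an example of a documented `spark.sql.*` property:

```python
# Adjust a spark.sql.* property at runtime via HiveQL's SET command.
# `sqlContext` is the SQLContext/HiveContext the notebook provides.
sqlContext.sql("SET spark.sql.shuffle.partitions=16")

# Read the current value back
sqlContext.sql("SET spark.sql.shuffle.partitions").show()
```

Keep the caveat above in mind: this changes the setting for the shared context, not just your own queries.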

JonathanSpooner
New Contributor II

Hi, may I know how you handled the config for Elasticsearch? I also have to stream data to Elasticsearch.

There is a 'Spark' tab on the cluster-creation page; you can add the configs there before starting the cluster.
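As a hedged sketch of what that tab could contain: cluster-level Spark config is entered one setting per line as `key value` pairs. The `spark.es.*` keys below are elasticsearch-hadoop connector settings and the host/port values are placeholders, not tested values:

```
spark.es.nodes elasticsearch.example.com
spark.es.port 9200
spark.es.index.auto.create true
```

Settings added here are applied when the cluster starts, which is why they must be in place before launching it.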

MatthewValenti
New Contributor II

This is an old post, however, is this still accurate for the latest version of Databricks in 2019? If so, how to approach the following?

1. Connect to many MongoDBs.

2. Connect to MongoDB when the connection string information is dynamic (i.e. stored in a Spark table).
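One way to sketch both points, assuming the MongoDB Spark connector is attached to the cluster: instead of fixing the URI in the cluster config, pass it per read via `.option()`. The table name `connection_configs` and its columns are hypothetical stand-ins for wherever the connection strings are stored:

```python
# Sketch: read connection strings from a Spark table, then connect to each
# MongoDB with a per-read option override (MongoDB Spark connector assumed).
# Table and column names here are hypothetical.
configs = spark.table("connection_configs").collect()

for row in configs:
    df = (spark.read
          .format("com.mongodb.spark.sql.DefaultSource")
          .option("spark.mongodb.input.uri", row["mongo_uri"])
          .load())
    # Expose each source under its own name for downstream queries
    df.createOrReplaceTempView(row["view_name"])
```

Because the URI is supplied per DataFrame read rather than at cluster startup, many MongoDBs can be reached from one cluster.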
