08-27-2023 03:22 AM
In Databricks I can set a config variable at the session level, but it is not found among the context variables:
spark.conf.set("dataset.bookstore", "123")
spark.conf.get("dataset.bookstore")    # '123'
scf = spark.sparkContext.getConf()
allc = scf.getAll()
scf.contains("dataset.bookstore")      # False
I understand there is a difference between session-level and context-level config variables. How can I retrieve all session-level variables using spark.conf?
Note: all_session_vars = spark.conf.getAll()
returns
AttributeError: 'RuntimeConfig' object has no attribute 'getAll'
so it looks like spark.conf is a runtime-level config object that does not expose getAll from Python.
Accepted Solutions
08-29-2023 01:09 AM
Hello,
In Databricks, you can set session-level configuration variables using spark.conf.set(), but these session-level variables are distinct from the context-level variables. While you can retrieve session-level variables using spark.conf.get(), you cannot directly retrieve all session-level variables using spark.conf.getAll().
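From Python, one standard way to list session-level values is the Spark SQL SET command, which returns the config entries modified in the current session as rows with key and value columns. This is a sketch, not the poster's code; it assumes a live SparkSession named spark, as in a Databricks notebook:

```python
# `SET` (without -v) lists config entries changed in this session,
# including keys set via spark.conf.set(), as rows with `key` and `value`.
session_vars = {row.key: row.value for row in spark.sql("SET").collect()}

# Hedged check: 'dataset.bookstore' appears here only if it was set
# earlier in this same session.
print(session_vars.get("dataset.bookstore"))
```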
session_conf = spark.sparkContext.getConf()
# SparkConf.getAll() already returns a list of (key, value) tuples
all_session_vars = session_conf.getAll()
# Note: these are the context-level (SparkContext) variables, so keys set
# later via spark.conf.set() will not appear here
08-29-2023 04:29 AM - edited 08-29-2023 04:31 AM
Turns out there is a way, see https://stackoverflow.com/questions/76986516/how-to-retrieve-all-spark-session-config-variables
08-08-2024 09:54 AM - edited 08-08-2024 09:54 AM
A while back I think I found a way to get Python to list all the config values, but I was not able to re-create it. Just make one of your notebook code cells Scala (first line) and use the second line:
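The cell itself is missing from the post, but it was presumably something along these lines (a guess, not the original: %scala is the Databricks magic command that switches a cell to Scala, and the Scala RuntimeConfig does expose getAll, unlike its PySpark counterpart):

```scala
%scala
// getAll returns a Map[String, String] of the session's runtime config
spark.conf.getAll.foreach(println)
```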

