10-02-2024 11:39 PM
Hi, we use a common notebook for all our "common" settings; it is called in the first cell of every notebook we develop. The issue we now have is that we need two common notebooks: one for normal shared compute and one for serverless, since serverless does not allow (or need) the same Spark config commands. What I'd like to do is determine, inside the common notebook, whether it is running on serverless compute or not, so that I can keep just one common notebook. Any ideas on whether and how I can do this?
Accepted Solutions
10-03-2024 12:41 AM
Hi @Dave1967 ,
If you know a Spark config command that is not supported on serverless, you can build your logic around that command with a try/except:
```python
def is_config_supported():
    # On serverless compute, accessing spark.sparkContext raises an error,
    # so this returns False there and True on classic (shared) compute.
    try:
        spark.sparkContext.getConf()
        return True
    except Exception:
        return False

print(is_config_supported())
```
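The common notebook could then branch on this result so the Spark config commands only run on classic compute. A minimal sketch of that idea (the config key and value below are placeholders, not your actual settings):

```python
if is_config_supported():
    # Classic (shared) compute: apply the cluster-level Spark settings here.
    spark.conf.set("spark.sql.shuffle.partitions", "64")  # placeholder setting
else:
    # Serverless compute: skip the Spark config commands it does not allow.
    pass
```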
10-03-2024 04:12 AM
Many thanks,
I did think about doing it this way; I just wondered if there was something neater, but actually this is fine.
Many thanks