Data Engineering

Serverless Compute - How to determine if it is being used programmatically

Dave1967
New Contributor III

Hi, we use a common notebook for all our "common" settings; this notebook is called in the first cell of each notebook we develop. The issue we are now having is that we need two common notebooks: one for normal shared compute and one for serverless, as serverless does not allow (or need) the same Spark config commands. What I'd like to do is determine in our common notebook whether it is running on serverless compute or not, so that I can keep just one common notebook. Any ideas if and how I can do this?
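
For context, the setup described above typically looks like this in the first cell of each development notebook (a minimal sketch; ./common_settings is a placeholder path, not the poster's actual notebook name):

%run ./common_settings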

2 Accepted Solutions

filipniziol
Contributor III

Hi @Dave1967 ,

If you know a Spark config command that is not supported on serverless, you can build your logic around that command using try/except:

def is_config_supported():
    """Return True on classic/shared compute, False on serverless."""
    try:
        # Accessing the SparkContext configuration raises an exception
        # on serverless compute, so this call only succeeds on classic compute.
        spark.sparkContext.getConf()
        return True
    except Exception:
        return False

print(is_config_supported())
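
The single common notebook could then branch on the result. A minimal sketch, assuming is_config_supported() is defined as above; the spark.sql.shuffle.partitions setting is just an illustrative session-level config, not one from the original post:

if is_config_supported():
    # Classic/shared compute: apply the usual Spark config settings.
    spark.conf.set("spark.sql.shuffle.partitions", "200")
else:
    # Serverless compute: skip the Spark config commands it does not allow or need.
    pass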


Dave1967
New Contributor III

Many thanks,

 

I did think about doing it this way; I just wondered if there was something neater, but actually this is fine.

Many thanks


