How to create a SparkSession in jobs run-unit-tests

yhu126
New Contributor

I’m converting my Python unit tests to run with databricks jobs run-unit-tests.
Each test needs a SparkSession, but every pattern I try breaks either locally or on Databricks.
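
For context, the tests themselves just consume a spark_session fixture; something roughly like this (the DataFrame and column names are made up for illustration):

def test_filter_keeps_positive_rows(spark_session):
    # spark_session is the pytest fixture shown further down
    df = spark_session.createDataFrame([(1,), (-2,), (3,)], ["value"])
    result = df.filter("value > 0").collect()
    assert [row.value for row in result] == [1, 3]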

What I tried

1. Create my own local Spark

from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .master("local[*]")
         .appName("unit-test")
         .getOrCreate())

Fails in Databricks with the “shared SparkContext” stack trace.

2. Generic builder (no master)

spark = SparkSession.builder.getOrCreate()

Works locally, but in Databricks it sometimes raises [MASTER_URL_NOT_SET] during fixture setup.

3. Try SparkContext first

from pyspark import SparkContext

sc = SparkContext.getOrCreate()
spark = SparkSession.builder.getOrCreate()
I also tried wrapping the logic in a pytest fixture:

import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark_session():
    spark = SparkSession.getActiveSession()
    if spark is None:
        spark = (SparkSession.builder
                 .master("local[*]")
                 .appName("unit-test")
                 .getOrCreate())
    return spark

…but in Databricks the builder.master("local[*]") branch still executes (presumably getActiveSession() returns None at that point in fixture setup), and the duplicate-context error appears.
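
The only other variant I can think of is gating the local builder on an environment check instead of getActiveSession(). A rough sketch of that idea (assuming the DATABRICKS_RUNTIME_VERSION environment variable is present when the tests run on Databricks, which I haven’t verified for this runner):

import os
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark_session():
    # On Databricks, reuse whatever session the runtime already provides.
    if "DATABRICKS_RUNTIME_VERSION" in os.environ:
        return SparkSession.builder.getOrCreate()
    # Locally, build a throwaway session against local[*].
    return (SparkSession.builder
            .master("local[*]")
            .appName("unit-test")
            .getOrCreate())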

Has anyone solved this cleanly?

1 REPLY

szymon_dybczak
Esteemed Contributor III
