I am setting up pytest for my repo. My functions live in separate Python files and I run pytest from a single notebook. In each test file, I currently have to create a new Spark session with a fixture like this:
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    spark = (
        SparkSession.builder.master("local[1]")
        .appName("test")
        .config("spark.executor.cores", "1")
        .config("spark.executor.instances", "1")
        .config("spark.sql.shuffle.partitions", "1")
        .getOrCreate()
    )
    yield spark
    spark.stop()
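Each test file then consumes the fixture by naming it as an argument. A small example of how I use it (the test name and data are just placeholders):

def test_row_count(spark):
    # pytest injects the fixture above via the argument name
    df = spark.createDataFrame([(1,), (2,)], ["id"])
    assert df.count() == 2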
Is it possible to create the Spark session once and share it across all the test files that need it?
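From what I understand, pytest automatically discovers fixtures defined in a conftest.py at the root of the test directory, so moving the fixture there should make it available to every test file without any imports. A minimal sketch of what I have in mind (file names are my assumption):

# conftest.py -- pytest discovers this file automatically and makes
# its fixtures available to every test file in the same directory tree
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    spark = (
        SparkSession.builder.master("local[1]")
        .appName("test")
        .config("spark.executor.cores", "1")
        .config("spark.executor.instances", "1")
        .config("spark.sql.shuffle.partitions", "1")
        .getOrCreate()
    )
    yield spark
    spark.stop()

# test_transforms.py -- no import of the fixture needed;
# pytest injects it by argument name
def test_select(spark):
    df = spark.createDataFrame([("a", 1)], ["key", "value"])
    assert df.select("key").count() == 1

Because the fixture is session-scoped, it should run once per pytest run, with every test file receiving the same SparkSession and spark.stop() executing only after the last test finishes. Is this the right approach, or is there a better pattern?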