Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Is it possible to pass a Spark session to other python files?

tomcorbin
New Contributor III

I am setting up pytest for my repo. My functions live in separate Python files, and I run pytest from a single notebook. Currently, each test file has to create its own Spark session, like this:

import pytest
from pyspark.sql import SparkSession


@pytest.fixture(scope="session")
def spark():
    spark = (
        SparkSession.builder.master("local[1]")
        .appName("test")
        .config("spark.executor.cores", "1")
        .config("spark.executor.instances", "1")
        .config("spark.sql.shuffle.partitions", "1")
        .getOrCreate()
    )
    yield spark
    spark.stop()
 
Is it possible to create one Spark session and pass it to the test files that need it?
1 ACCEPTED SOLUTION

tomcorbin
New Contributor III

I was able to do it by placing the Spark session fixture in the conftest.py file in the root directory. 
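To make this concrete, here is a minimal sketch of that layout. pytest automatically discovers fixtures defined in a root-level conftest.py, so any test file under that root can request the `spark` fixture by name without importing it. The file name `test_example.py` and the test body are illustrative, not from the original post:

```python
# conftest.py -- placed at the repo root so pytest discovers it automatically.
# Fixtures defined here are shared by every test file; no imports needed
# in the tests themselves.
import pytest
from pyspark.sql import SparkSession


@pytest.fixture(scope="session")
def spark():
    # scope="session" means one Spark session is created for the whole
    # test run and reused across all test files.
    spark = (
        SparkSession.builder.master("local[1]")
        .appName("test")
        .config("spark.executor.cores", "1")
        .config("spark.executor.instances", "1")
        .config("spark.sql.shuffle.partitions", "1")
        .getOrCreate()
    )
    yield spark
    spark.stop()


# test_example.py -- a hypothetical test file anywhere under the root.
# It simply declares `spark` as a parameter and pytest injects the
# shared session from conftest.py.
def test_row_count(spark):
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "letter"])
    assert df.count() == 2
```

Because the fixture is session-scoped, the session is built once on first use and torn down after the last test, which avoids the startup cost of creating a new Spark session per file.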

