If you hit an issue related to an already existing Spark context (only one SparkContext instance can exist in a single JVM), you can try the following approach:
from pyspark import SparkContext
from pyspark.sql import SparkSession
# Check whether a Spark context already exists and reuse it
try:
    sc = SparkContext.getOrCreate()
    spark = SparkSession(sc)
    print("Using existing Spark context.")
except Exception as e:
    # Fall back to creating a fresh context if one could not be obtained
    print("No existing Spark context found. Creating a new one.")
    sc = SparkContext()
    spark = SparkSession(sc)
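
As a side note, in recent PySpark versions the same result is usually achieved with SparkSession.builder.getOrCreate(), which reuses any existing session (and its underlying SparkContext) or creates a new one. A minimal sketch, assuming a local run (the app name and master shown here are illustrative, not from the original post):

from pyspark.sql import SparkSession

# getOrCreate() returns the active session if one exists,
# otherwise it builds a new one with the given configuration
spark = SparkSession.builder \
    .appName("example-app") \
    .master("local[*]") \
    .getOrCreate()

sc = spark.sparkContext  # the underlying SparkContext, if you still need it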
Mich Talebzadeh | Technologist | Data | Generative AI | Financial Fraud
London
United Kingdom
view my LinkedIn profile
https://en.everybodywiki.com/Mich_Talebzadeh
Disclaimer: The information provided is correct to the best of my knowledge but of course cannot be guaranteed. It is essential to note that, as with any advice, "one test result is worth one-thousand expert opinions" (Wernher von Braun).