Hi everyone,
I'm writing a small function that uses Spark to read from a CSV and write into a table. It runs fine inside a notebook, but when I register the same function as a Unity Catalog function and call it from the Playground, it throws a Spark exception. Can someone tell me what I'm missing?
Code for reference:

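For reference, the function is roughly this shape. This is only a sketch reconstructed from the stack trace below (the names `main`, `ingest_csv`, `csv_path`, and `table_name` appear there); the read/write options and return value are assumptions, not the actual code:

```python
def ingest_csv(csv_path: str, table_name: str) -> str:
    # Import inside the function, mirroring the <udfbody> structure in the trace.
    from pyspark.sql import SparkSession

    # This is the call that fails when the function runs as a UC function:
    # per the trace, getOrCreate() tries to launch a new gateway, cannot
    # find SPARK_HOME, and hits sys.exit(-1) in _find_spark_home.
    spark = SparkSession.builder.getOrCreate()

    df = spark.read.option("header", "true").csv(csv_path)  # assumed options
    df.write.mode("append").saveAsTable(table_name)          # assumed write mode
    return f"ingested into {table_name}"


def main(csv_path: str, table_name: str) -> str:
    return ingest_csv(csv_path, table_name)
```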
Error:

== Error ==
SystemExit: -1
== Stacktrace ==
  File "<udfbody>", line 28, in main
    return ingest_csv(csv_path, table_name)
  File "<udfbody>", line 14, in ingest_csv
    spark = SparkSession.builder.getOrCreate()
  File "/databricks/spark/python/pyspark/sql/session.py", line 574, in getOrCreate
    else SparkContext.getOrCreate(sparkConf)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/pyspark/core/context.py", line 579, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/databricks/spark/python/pyspark/core/context.py", line 207, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "/databricks/spark/python/pyspark/core/context.py", line 500, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
                            ^^^^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/pyspark/java_gateway.py", line 63, in launch_gateway
    SPARK_HOME = _find_spark_home()
                 ^^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/pyspark/find_spark_home.py", line 91, in _find_spark_home
    sys.exit(-1)

SQLSTATE: 39000
Any help would be greatly appreciated.
Thank you.
Regards,
Giri