I have a Java application packaged as a JAR, which is installed on a Databricks cluster. The JAR reads from and writes to a few Databricks tables, and to do that it needs a SparkSession in the code. Since a Spark session is already running on Databricks, I need a way to get hold of it from my code.
I tried several approaches, such as SparkSession.builder, SparkSession.active, and SparkSession.getActiveSession, but none of them work on Databricks. How can I access the existing session from inside my code?
Passing the session directly as an argument to the main method does work, but the main method is also required to accept some other String arguments, so I need to obtain the session in the code some other way.
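For reference, this is my understanding of how the getOrCreate lookup is supposed to behave on the cluster. It is only a sketch: `FakeSession` below is a hypothetical stand-in for `SparkSession` (Spark itself isn't importable in this snippet), and it models the semantics I expect from `SparkSession.builder().getOrCreate()`, namely that it should hand back the session Databricks already started rather than construct a new one.

```java
import java.util.concurrent.atomic.AtomicReference;

public class Main {
    // Hypothetical stand-in for SparkSession, used only to illustrate
    // the "reuse the active session" behavior of getOrCreate.
    static class FakeSession {
        private static final AtomicReference<FakeSession> ACTIVE = new AtomicReference<>();
        final String id;

        private FakeSession(String id) { this.id = id; }

        // Mimics Databricks starting a session before user code runs.
        static FakeSession startClusterSession() {
            FakeSession s = new FakeSession("cluster-session");
            ACTIVE.set(s);
            return s;
        }

        // Mimics SparkSession.builder().getOrCreate(): return the active
        // session if one exists, otherwise create a fresh one.
        static FakeSession getOrCreate() {
            FakeSession existing = ACTIVE.get();
            return existing != null ? existing : new FakeSession("new-session");
        }
    }

    public static void main(String[] args) {
        FakeSession cluster = FakeSession.startClusterSession(); // Databricks' session
        FakeSession mine = FakeSession.getOrCreate();            // what my JAR calls
        System.out.println(mine == cluster); // true: the running session is reused
        System.out.println(mine.id);         // cluster-session
    }
}
```

If getOrCreate really works this way inside a JAR on the cluster, it would remove the need to thread the session through the main-method arguments alongside the other String parameters.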