How to retrieve Spark Session inside java jar library installed on Cluster
07-17-2024 11:42 PM
I have a Java app packaged as a jar. This jar is installed on a Databricks cluster and reads from and writes to a few tables in Databricks. To do that, the code needs a SparkSession. Since a Spark session is already running on the cluster, I need some way to access it.
I tried various methods such as SparkSession.builder, SparkSession.active, and SparkSession.getActiveSession, but none of them worked in Databricks. How can I access the session from inside my code?
Passing the session directly as a main-function argument does work, but main also has to accept some other String arguments, so I need another way to obtain the session in code.
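One common approach (a minimal sketch, not an official Databricks recipe; the class and table names below are hypothetical) is to call a bare `SparkSession.builder().getOrCreate()` inside the jar. On a cluster where the driver already has a session, `getOrCreate()` attaches to that existing session instead of creating a new one:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class MyDatabricksJob {  // hypothetical class name
    public static void main(String[] args) {
        // On a Databricks cluster a SparkSession already exists on the
        // driver; getOrCreate() returns it rather than starting a new one.
        SparkSession spark = SparkSession.builder().getOrCreate();

        // The session can then be used to read and write tables, e.g.:
        Dataset<Row> df = spark.table("my_schema.my_table");  // hypothetical table
        df.show();
    }
}
```

Note that adding extra builder configuration (such as `.appName(...)`) may interfere with attaching to the existing session on some runtimes, so keeping the builder bare is the safer starting point.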
10-04-2024 11:23 AM
Hi, I have a Java jar in a Volume and am trying to get the existing session with Unity Catalog enabled, but I only get spark_catalog. Any help would be appreciated.
07-18-2024 05:46 AM
Thank you for the reply; this solves the issue. I was also passing ".appName" with the builder, which was causing the problem. I tried the solution you mentioned and it worked. Thank you.
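For anyone hitting the same problem, the fix described here can be sketched roughly as follows (assuming the standard Spark builder API; the exact effect of setting `appName` against an existing cluster session may vary by runtime version):

```java
import org.apache.spark.sql.SparkSession;

public class SessionExample {  // hypothetical class name
    public static void main(String[] args) {
        // Problematic: extra builder configuration such as appName
        // interfered with attaching to the cluster's existing session.
        // SparkSession spark = SparkSession.builder()
        //         .appName("my-app")
        //         .getOrCreate();

        // Working: a bare getOrCreate() returns the session that
        // Databricks has already started on the driver.
        SparkSession spark = SparkSession.builder().getOrCreate();
        System.out.println(spark.version());
    }
}
```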
10-11-2024 03:13 AM
Thanks for the update, I will try it too.