What's the best way to develop Apache Spark Jobs from an IDE (such as IntelliJ/Pycharm)?
06-07-2021 10:53 AM
A number of people like developing locally using an IDE and then deploying. What are the recommended ways to do that with Databricks jobs?
Labels: IDE Dev Support, Spark
06-07-2021 10:57 AM
The Databricks Runtime and open-source Apache Spark share the same core API, so a Spark job written and tested locally can run on Databricks with all Databricks features available.
The one requirement is to obtain the SparkSession with SparkSession.builder.getOrCreate() rather than constructing a new one: on Databricks the SparkSession is pre-created and treated as a singleton, and getOrCreate() returns that existing session (while creating a fresh local one when run from an IDE).
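For example, a minimal PySpark script along these lines (the app name and sample data are purely illustrative) runs unchanged both from a local IDE and as a Databricks job:

```python
from pyspark.sql import SparkSession

# getOrCreate() returns the pre-created singleton session on Databricks,
# and builds a new local session when run from an IDE.
spark = SparkSession.builder.appName("example-job").getOrCreate()

# A trivial transformation that behaves identically in both environments.
df = spark.createDataFrame([(1, "a"), (2, "b"), (3, "a")], ["id", "value"])
df.groupBy("value").count().show()
```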
You can also test with Databricks Connect. Databricks Connect replaces Apache Spark/PySpark on your local machine and lets code running locally execute its jobs on a Databricks cluster: https://docs.databricks.com/dev-tools/databricks-connect.html
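A sketch of that workflow, based on the classic Databricks Connect setup in the linked docs (exact package version and configuration values depend on your cluster's runtime):

```python
# One-time setup in a terminal (values are illustrative):
#   pip uninstall pyspark
#   pip install databricks-connect==<your cluster's DBR version>
#   databricks-connect configure   # prompts for workspace URL, token, cluster ID
#   databricks-connect test        # verifies connectivity to the cluster

from pyspark.sql import SparkSession

# With databricks-connect installed in place of pyspark, the same
# getOrCreate() call returns a session whose work runs on the remote
# Databricks cluster rather than on the local machine.
spark = SparkSession.builder.getOrCreate()
print(spark.range(10).count())  # computed on the cluster, result returned locally
```

Because the entry point is the same getOrCreate() call, you can switch between local Spark and Databricks Connect without changing your job code.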

