Hi @Janga Reddy,
You can use a job to run a data processing or data analysis task in a Databricks cluster with scalable resources.
Your job can consist of a single task or a large, multi-task workflow with complex dependencies.
Databricks manages the task orchestration, cluster management, monitoring, and error reporting for your jobs.
You can run your jobs immediately or periodically through an easy-to-use scheduling system. You can implement job tasks using notebooks, JARs, Delta Live Tables pipelines, Python scripts, Scala, Spark submit, and Java applications.
You create jobs through the Jobs UI, the Jobs API, or the Databricks CLI. The Jobs UI allows you to monitor, test, and troubleshoot your running and completed jobs.
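As an illustration of the API route, here is a minimal sketch that creates a scheduled single-task notebook job by calling the Jobs API 2.1 `jobs/create` endpoint with Python's `requests` library. The workspace URL, token, notebook path, and cluster settings are placeholders you would replace with your own values, and the exact fields you need may differ depending on your workspace and cloud.

```python
import requests

# Placeholder values -- substitute your workspace URL and a personal access token.
DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

# Minimal Jobs API 2.1 payload: one notebook task on a new job cluster,
# scheduled to run daily at 02:00 UTC (Quartz cron syntax).
job_spec = {
    "name": "example-nightly-job",
    "tasks": [
        {
            "task_key": "process_data",
            "notebook_task": {"notebook_path": "/Users/someone@example.com/process_data"},
            "new_cluster": {
                "spark_version": "11.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
        }
    ],
    "schedule": {"quartz_cron_expression": "0 0 2 * * ?", "timezone_id": "UTC"},
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_spec,
)
resp.raise_for_status()
print("Created job_id:", resp.json()["job_id"])
```

The same JSON payload can also be passed to the Databricks CLI or built in the Jobs UI; the returned `job_id` is what you use afterwards to trigger runs or check their status.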
To get started:
- Create your first Databricks jobs workflow with the quickstart.
- Learn how to create, view, and run workflows with the Databricks jobs user interface.
- Learn about Jobs API updates to support creating and managing workflows with Databricks jobs.