Serverless compute vs Job cluster
02-21-2025 07:13 AM
Hi Guys,
For running a job with a varying workload, what should I use: serverless compute or job compute?
What are the pros and cons of each?
(I'll be running my notebook from Azure Data Factory.)
Labels: Spark
02-21-2025 08:33 AM
Hi
If you use PySpark for data processing, then job compute is the better fit. Its pros:
- lower cost
- support for PySpark
- flexible configuration

Cons:
- slower startup time

A serverless SQL warehouse, by contrast, has these pros:
- faster startup time
- dedicated to SQL workloads
- less management overhead

Cons:
- no PySpark support
- higher cost per compute unit (Photon acceleration)
- less customization

A minimal sketch of a job-cluster definition follows this list.
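If you go the job compute route, here is a minimal sketch of what the cluster definition could look like through the Databricks Jobs API 2.1. The host, token, notebook path, node type, and runtime version are placeholders, not values from this thread. Since autoscaling is what handles the "varying workload" part of the question, the key piece is the `autoscale` block:

```python
import os
import requests

# Sketch only: create a job that runs a notebook on an autoscaling job cluster.
# All identifiers below are illustrative placeholders -- substitute your own.
host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-123.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]

payload = {
    "name": "varying-workload-job",
    "tasks": [
        {
            "task_key": "process",
            "notebook_task": {"notebook_path": "/Workspace/Users/me/etl_notebook"},
            "new_cluster": {
                "spark_version": "15.4.x-scala2.12",  # pick a current LTS runtime
                "node_type_id": "Standard_DS3_v2",    # Azure VM type of your choice
                # Cluster grows and shrinks with the workload between these bounds.
                "autoscale": {"min_workers": 1, "max_workers": 8},
            },
        }
    ],
}

resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())  # {'job_id': ...}
```

From Azure Data Factory you would then trigger this job (or let the ADF Databricks Notebook activity create an equivalent new job cluster per run); the autoscale bounds are the knobs you tune for a varying workload.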
02-21-2025 09:33 AM
It depends on the cost, performance, and startup time your use case needs.
Serverless compute is usually the preferred choice because of its fast startup time and dynamic scaling. However, if your workload is long-running and predictable, job compute with autoscaling may be more cost-effective. The sketch below shows how the two differ in a job definition.
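For comparison, a sketch of the serverless variant of the same job, posted to the same `/api/2.1/jobs/create` endpoint as above. This assumes serverless jobs compute is enabled in your workspace; my understanding is that omitting any cluster specification on a notebook task makes it run on serverless, but verify that against the Jobs API docs for your workspace. Note there is no cluster block to tune, which is where both the "less management" upside and the "less customization" downside come from:

```python
# Sketch: the same notebook task targeting serverless jobs compute.
# Assumption: with serverless enabled, a task with no new_cluster /
# existing_cluster_id / job_cluster_key runs on serverless compute.
serverless_payload = {
    "name": "varying-workload-job-serverless",
    "tasks": [
        {
            "task_key": "process",
            "notebook_task": {"notebook_path": "/Workspace/Users/me/etl_notebook"},
            # no cluster specification here: nothing to size or tune
        }
    ],
}
```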

