How to Create a Cluster for Running a Job
yesterday - last edited yesterday
I am encountering an error while executing a job (workflow pipeline). The job uses serverless compute by default, but it requires a single-node cluster, and I cannot see any option to create a cluster in my subscription, even though I added my credit card and have a $390 credit balance.
"Directly accessing the underlying Spark driver JVM using the attribute 'sparkContext' is not supported on serverless compute. If you require direct access to these fields, consider using a single-user cluster. For more details on compatibility and limitations, check: https://docs.databricks.com/release-notes/serverless.html#limitationsFile <command-3107834473466653>, line 2"
- Labels: ClusterCreation
3 hours ago
Hi @ShailenGenpact, could you check this post? It might help you:
https://community.databricks.com/t5/data-engineering/sparkcontext-in-runtime-15-3/td-p/81646
That thread describes a similar case and some alternative workarounds that were used; a rough sketch of one such workaround is below.
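As a rough illustration of the kind of workaround discussed there (not necessarily what your pipeline does), the sketch below replaces a direct `spark.sparkContext` call such as `parallelize` with the DataFrame API, which is allowed on serverless compute. The sample data and column names are hypothetical.

```python
# Hypothetical example: build a DataFrame without touching sparkContext,
# so the same code can run on serverless compute.

# Not supported on serverless compute (raises the error quoted above):
# rdd = spark.sparkContext.parallelize([(1, "a"), (2, "b")])
# df = rdd.toDF(["id", "value"])

# Serverless-friendly alternative: create the DataFrame directly.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# spark.range also avoids sparkContext when generating synthetic data.
ids = spark.range(0, 1000)

df.show()
```

If the job genuinely needs RDD or JVM-level access, the other option mentioned in the error message is to run it on a single-user (dedicated) cluster instead of serverless compute.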

