I am encountering an error while running a job (workflow-pipiline) on serverless compute, which is the default. The error says the job requires a single-node cluster, but I don't see any option to create a cluster in my subscription, even though I added my credit card and have a $390 credit balance.
"Directly accessing the underlying Spark driver JVM using the attribute 'sparkContext' is not supported on serverless compute. If you require direct access to these fields, consider using a single-user cluster. For more details on compatibility and limitations, check: https://docs.databricks.com/release-notes/serverless.html#limitationsFile <command-3107834473466653>, line 2"
(Screenshot attached: 2025-03-12 124910)
Shailendra