
Serverless All-purpose cluster in Free Edition

Kai-
New Contributor III

Hi Databricks community,

Good day.

Has anyone managed to use a serverless all-purpose cluster in Databricks Free Edition?

The documentation does mention that Free Edition allows users to create a small serverless all-purpose cluster. But in my Free Edition workspace there is no option to create an all-purpose cluster; only a serverless SQL warehouse is available.

**Note: I am trying to run some PySpark code in a notebook, and a serverless SQL warehouse only supports SQL. If Free Edition doesn't allow creating an all-purpose cluster, it sounds like there is no other way to run non-SQL notebooks/code in Free Edition.
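For context, the kind of cell I want to run looks like the snippet below; it is plain PySpark, so a SQL-only warehouse cannot execute it (an illustrative sketch, not my exact code; `spark` and `display` are provided by the Databricks notebook runtime):

from pyspark.sql import functions as F

# Plain PySpark: a serverless SQL warehouse only accepts SQL, so a cell
# like this needs all-purpose / serverless compute attached to the notebook.
df = spark.range(10).withColumn("squared", F.col("id") * F.col("id"))
display(df)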

https://docs.databricks.com/aws/en/getting-started/free-edition-limitations#compute-limitations

[screenshot: Kai_0-1750924137992.png]

My Free edition workspace:

 

[screenshot: Kai_3-1750924506696.png]

Thanks.

 

1 ACCEPTED SOLUTION


ilir_nuredini
Honored Contributor

Hello @Kai- 

You can use serverless compute, and there is no need to provision it; Databricks handles that for you.
All you need to do is click Connect at the top right, choose Serverless, and you are good to go
to run non-SQL notebooks.
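For example, once the notebook is connected to Serverless, plain PySpark and SQL both run in the same notebook (a minimal sketch; `spark` is provided by the Databricks notebook runtime):

from pyspark.sql import functions as F

# Runs once the notebook is attached to Serverless compute
# (minimal sketch; `spark` comes from the Databricks notebook runtime).
df = spark.range(100).withColumn("bucket", F.col("id") % 5)
df.createOrReplaceTempView("numbers")  # register the DataFrame for SQL access

counts = spark.sql("SELECT bucket, COUNT(*) AS n FROM numbers GROUP BY bucket")
counts.show()  # both the PySpark and the SQL step execute on serverless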

[screenshot: ilir_nuredini_0-1750935735794.png]


Hope that helps. Best, Ilir

 


4 REPLIES


 

Navneet670

But in Databricks Free Edition we only have serverless compute, so how do we view Spark jobs and the Spark UI?

Kai-
New Contributor III

Hi @Navneet670,

It's a limitation of serverless compute; the Spark UI is not available.

https://docs.databricks.com/aws/en/compute/serverless/limitations#general-limitations 

[screenshot: Kai_0-1757487126735.png]

FYR, another community post that discusses the same topic:
https://community.databricks.com/t5/get-started-discussions/is-spark-ui-available-on-the-databricks-... 
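As a partial alternative (an illustrative sketch, not something from the linked post), the query plan can still be printed from the notebook with DataFrame.explain(), which should work on serverless compute:

from pyspark.sql import functions as F

# The Spark UI is not exposed on serverless compute, but the query plan
# can still be printed to the notebook output (illustrative sketch;
# `spark` is provided by the Databricks notebook runtime).
df = (spark.range(1_000_000)
        .groupBy((F.col("id") % 10).alias("bucket"))
        .count())

df.explain(mode="formatted")  # prints the formatted physical plan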

 

Thanks.

Best Regards,

Kai

 

Kai-
New Contributor III

Hi @ilir_nuredini,

Thanks for the suggestion, it worked!

Just sharing for information: I noticed that it throws an error when choosing Serverless in an imported notebook (tested with HTML and DBC), but it works fine in a new notebook (with the same PySpark code).

Imported via DBC:

[screenshot: Kai_0-1751006617349.png]

Same code in new notebook:

[screenshot: Kai_1-1751006672597.png]

Thanks.
