4 weeks ago
I am unable to connect serverless compute under the Free edition of Databricks. In the Compute tab I can only see three tabs (SQL warehouses, Vector search, Apps), and I am not able to create new compute the way we used to in Community edition.
3 weeks ago
@Pavankumar7 - are you experiencing this issue for existing/imported notebooks, or for brand new notebooks too?
If it's the former, the notebook may be using an old serverless environment version. When Databricks updates the Serverless environment, existing notebooks don't automatically switch to point at the new version - you have to manually or programmatically update it.
In the notebook itself, click Environment (on the far right-hand side of the screen), then look for "Environment version" and switch to a newer version.
I had a similar issue with some imported notebooks: updating from Serverless environment version 1 to version 2 fixed it. Interestingly, version 3 did not work, but version 2 did.
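The "manually or programmatically" update mentioned above can also be scripted when you have many imported notebooks to fix. As a minimal sketch only: the `environment_version` key and the metadata layout below are hypothetical stand-ins for illustration, not a documented Databricks export format, so adjust the key name to whatever your actual exported notebook metadata uses.

```python
import json

def bump_environment_version(notebook_meta: dict, target: str = "2") -> dict:
    """Return a copy of the notebook metadata with its serverless
    environment version set to `target`.

    NOTE: "environment_version" is a hypothetical field name used here
    for illustration; inspect your exported notebook to find the real
    location of the serverless environment setting.
    """
    updated = dict(notebook_meta)  # shallow copy; original left untouched
    updated["environment_version"] = target
    return updated

# Example: an imported notebook still pinned to version 1
meta = {"name": "etl_notebook", "environment_version": "1"}
fixed = bump_environment_version(meta, target="2")
print(json.dumps(fixed))
```

Looping this over every exported notebook in a project, then re-importing, would apply the same fix the UI steps above perform one notebook at a time.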
4 weeks ago
Hey @Pavankumar7
I believe there may be differences between the Free and Community editions.
Under the Free edition's limitations, you are only able to attach a single 2X-Small cluster size.
I believe that's why you won't be able to create additional compute.
I hope this helps!
4 weeks ago
Hi Pavan,
You can use the free trial of 15 days if you wish to use the premium services to learn more or play around.
4 weeks ago - last edited 4 weeks ago
Thank you for your response.
Here is the catch: if I create a new workspace and add a notebook, serverless compute works fine. But when I import the existing project's DBC file into the workspace and try to run the notebook, serverless compute does not work.
It would be helpful if someone could provide the correct solution!
4 weeks ago
Hi Pavankumar,
Since it's a DBC file, serverless compute should work. If it isn't working, try this: export the current version from your new workspace, delete the old folder structure, and import the newly exported file. That should resolve the issue; it may be caused by a difference in the schema/catalog structure, or by serverless compute pointing to a different location. It's better to create new schemas and tables and run queries on top of them, and make sure the linked service points to the existing/current catalog.
3 weeks ago
Thank you @Thomas_W, it's working now!