Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Notebook Import Failed Due to Workspace Quota Exceeded Error

singhanuj2803
New Contributor II

I am using Databricks Community Edition and am seeking assistance with an issue I encountered while attempting to upload a notebook to my workspace. I received the following error message:

[Screenshot: notebook import failed because the workspace quota was exceeded]

It appears that I have exceeded the allowed Databricks Community Edition quota. I would greatly appreciate your guidance on how to resolve this issue. Specifically, I would like to understand:

  1. The best practices for managing and reducing Databricks Community Edition workspace storage.

  2. Any immediate steps I can take to delete unnecessary items and free up space.

  3. Potential options for increasing the Databricks Community Edition workspace quota, if available.

Thank you for your support. I look forward to your prompt response and assistance in resolving this matter.

1 ACCEPTED SOLUTION


Walter_C
Databricks Employee

To resolve this issue, you need to free up some space in your workspace. Here are the steps you can follow:

  1. Delete Unnecessary Notebooks or Files: Go through your workspace and identify any notebooks, files, or directories that are no longer needed. You can delete these items to free up space.

  2. Export and Backup Important Notebooks: If you have important notebooks that you do not want to delete, consider exporting them to your local machine or another storage service for backup. Once exported, you can delete them from the workspace to free up space.

  3. Check for Large Files: Sometimes, large files or datasets can consume significant space. Identify and remove any large files that are not currently needed (see the sketch after this list for one way to surface them).
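For the large-file check in step 3, you can run a quick scan from a notebook in the same workspace. The snippet below is a minimal sketch, assuming your uploads live under DBFS's default /FileStore area and using an arbitrary 10 MB threshold; `dbutils` is only available inside a Databricks notebook, and the paths in the trailing comments are hypothetical examples.

```python
# Walk DBFS under /FileStore and list the largest files so you can decide
# what to delete. Run this in a Databricks notebook, where `dbutils` exists.

def walk(path):
    """Recursively yield (path, size_in_bytes) for every file under `path`."""
    for info in dbutils.fs.ls(path):
        if info.name.endswith("/"):      # directory entries carry a trailing slash
            yield from walk(info.path)
        else:
            yield info.path, info.size

THRESHOLD = 10 * 1024 * 1024             # flag files larger than ~10 MB; adjust to taste

large_files = sorted(
    (f for f in walk("dbfs:/FileStore") if f[1] > THRESHOLD),
    key=lambda f: f[1],
    reverse=True,
)
for path, size in large_files:
    print(f"{size / (1024 * 1024):8.1f} MB  {path}")

# Once you are certain a file is no longer needed, remove it, e.g.:
# dbutils.fs.rm("dbfs:/FileStore/tables/old_dataset.csv")            # single file
# dbutils.fs.rm("dbfs:/FileStore/old_experiment/", recurse=True)     # whole directory
```

For step 2, notebooks can be exported from the workspace UI (right-click a notebook or folder and choose Export, for example as a DBC archive or source file) before you delete them; the Databricks CLI and Workspace API also support bulk export, though Community Edition may not let you create the access token those tools require.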


2 REPLIES


singhanuj2803
New Contributor II

Thanks. What is the maximum storage limit of Databricks Community Edition?
