Administration & Architecture
Can't create a new table from uploaded file.

Corwyn
New Contributor III
I've just started using the Community Edition through the AWS Marketplace, and I'm trying to set up tables to share with a customer. I've managed to create three of the tables, but I'm having problems when uploading a small version of the fourth file.
 
The original file that I want to upload is 14 GB, so for testing purposes I cut it down to the first 1000 rows (and then 100 rows when that didn't work).
[screenshot: Corwyn_1-1751310273366.png]

It shows me a preview of 50 rows and 539 columns, as expected, and there aren't any error messages that I can find on the page. I've tried auto-detection of column types and also turning it off, with no change.

However, I can't create the table:

[screenshot: Corwyn_2-1751310411172.png]

Any ideas of how I can figure out what errors I need to fix?  Thank you!

-Corwyn

1 ACCEPTED SOLUTION


Corwyn
New Contributor III

Thank you, Lou.

By loading the file manually, I found the error that wasn't being displayed in the UI. Once I took care of it, everything loaded just fine.

[screenshot: Corwyn_0-1751388961168.png]
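For anyone who hits the same wall, a manual load from a notebook is roughly the sketch below (the path and table name are placeholders for your own). Reading with mode set to FAILFAST makes Spark raise the parsing error instead of hiding it.

# Sketch only: adjust the path, options, and destination to your file.
# mode="FAILFAST" makes Spark throw on the first malformed record, so the
# real error message surfaces instead of being swallowed by the upload UI.
df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .option("mode", "FAILFAST")
      .csv("/FileStore/tables/sample_1000_rows.csv"))  # placeholder path

df.printSchema()
df.write.saveAsTable("my_schema.my_table")  # placeholder destination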


2 REPLIES

BigRoux
Databricks Employee

Common Root Causes of Table Upload Failure

  1. File Size Limitations

    • The file upload UI in Databricks only supports a total upload size of under 2 GB, regardless of the number or size of individual files. Attempting to upload or create tables from files larger than this will not succeed (one workaround is sketched after this list).
  2. Column Count or Data Complexity

    • While the upload documentation doesn't state an explicit column limit, very wide tables can cause schema-inference or display problems in the UI, especially for files with 500+ columns.
  3. Preview vs. Table Creation

    • Even if the file previews correctly (e.g., showing 50 rows as expected), there may still be underlying schema inference or parsing issues that only surface when you attempt to "create" the table. The preview step does not perform full validation.
       
  4. Format and File Integrity

    • The file format must be one of the supported types: CSV, TSV, JSON, Avro, Parquet, or plain text, and must not be compressed (e.g., no .zip or .tar files). Any violation can cause table creation to fail silently.
       
    • Consistency is required in header rows when uploading multiple files; mixed headers can cause silent data loss or schema confusion.
       
  5. Permission and Compute Requirements

    • You must have an active compute resource attached (a running cluster or SQL warehouse) and proper permissions to write to the destination schema. If a cluster is not active or you lack rights, table creation will not proceed, sometimes with little or no error detail (a quick write-access check is sketched after this list).
       
  6. Workspace or Platform Limitations

    • Workspace admins can disable the file upload page entirely. Since Community Edition has limited features (and sometimes divergent UI), some functionality may be restricted or buggy compared to full Databricks workspaces.
    • Community Edition is known to have fewer debugging and logging tools available for users compared to paid tiers.
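
If you eventually need the full 14 GB file but still want to go through the upload page, one rough workaround for point 1 is to split the CSV into parts that stay under the 2 GB limit, repeating the header row in each part so the schemas stay consistent. A minimal local-Python sketch (file names are made up):

# Sketch only: run locally before uploading; file names are placeholders.
import csv

ROWS_PER_PART = 1_000_000  # tune so each part stays well under 2 GB

with open("big.csv", newline="") as src:          # placeholder input file
    reader = csv.reader(src)
    header = next(reader)
    part, rows, out, writer = 0, 0, None, None
    for row in reader:
        if out is None:
            part += 1
            out = open(f"big_part{part}.csv", "w", newline="")
            writer = csv.writer(out)
            writer.writerow(header)               # repeat header in every part
        writer.writerow(row)
        rows += 1
        if rows >= ROWS_PER_PART:
            out.close()
            out, rows = None, 0
    if out is not None:
        out.close()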
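
For point 5, a quick way to confirm that your attached compute can actually write to the destination schema is a throwaway round trip like this sketch (replace my_schema with your real schema); if this fails, the upload UI was never going to work either:

# Sketch only: replace my_schema with your destination schema.
# Round-trips a one-row table to confirm the attached cluster or SQL
# warehouse can write there.
spark.sql("CREATE TABLE my_schema.upload_smoke_test AS SELECT 1 AS ok")
spark.sql("SELECT * FROM my_schema.upload_smoke_test").show()
spark.sql("DROP TABLE my_schema.upload_smoke_test")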

 

In short, Community Edition is really designed for learning on very small datasets.

Hope this helps, Lou.

