Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

How to make a table in databricks using excel file

senkii
New Contributor II

I want to upload an Excel file and create a table from it using the Databricks UI.

Since Databricks doesn't accept Excel files, I first changed the data type of all the data in Excel to string, exported it as UTF-8 CSV, clicked the Create button in the UI, and uploaded the CSV.

But it failed with many errors. Can someone help me?

1 ACCEPTED SOLUTION

iyashk-DB
Databricks Employee

If your workspace has Genie spaces enabled, you can upload Excel (.xlsx) files directly to Genie and analyze them there; it's designed for quick validation and natural-language queries over uploaded files and UC tables.

Otherwise, you can use the following approaches:

Option A: Use the "Create or modify a table" UI with a clean CSV

  • Open the Add or upload data flow and choose Create or modify a table. Select your CSV file, then pick an active compute to preview (SQL warehouse or serverless). Group clusters are not supported for this preview step.
  • In the format options for CSV, set these correctly:
    • Turn on/off First row contains the header to match your file. Header settings must be consistent across files you upload together.
    • Set Column delimiter to the actual separator (comma is default; only one character is allowed and backslash is not supported).
    • If schema inference caused casting problems, disable Automatically detect column types so all columns are treated as STRING; you can transform after table creation.
    • If any field contains embedded newlines, enable Rows span multiple lines before previewing.
  • Check and fix column names:
    • Column names must not include commas, backslashes, or Unicode characters; avoid special characters in general (spaces are commonly problematic; rename them to underscores before creating the table, or rename later in a notebook/workflow).
  • Confirm size constraints:
    • The table upload UI supports up to 10 files with a combined size under 2 GB in a single operation.
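
The column-name cleanup above can be scripted before upload. A minimal sketch in plain Python (the function and file names are illustrative, not part of any Databricks API):

```python
import csv
import re

def sanitize_header(name: str) -> str:
    """Replace characters the table-upload UI rejects (spaces, commas,
    backslashes, non-ASCII) with underscores, then collapse repeats."""
    cleaned = re.sub(r"[^A-Za-z0-9_]", "_", name.strip())
    return re.sub(r"_+", "_", cleaned).strip("_") or "col"

def clean_csv_headers(src_path: str, dst_path: str) -> list[str]:
    """Copy a CSV, rewriting only the header row; returns the new names."""
    with open(src_path, newline="", encoding="utf-8") as src:
        rows = list(csv.reader(src))
    rows[0] = [sanitize_header(h) for h in rows[0]]
    with open(dst_path, "w", newline="", encoding="utf-8") as dst:
        csv.writer(dst).writerows(rows)
    return rows[0]
```

Run it on the CSV before pointing the upload UI at it, so the preview step never sees a rejected column name.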

Option B: Upload the file to a Unity Catalog volume, then create a table from it (robust and flexible)

  • Upload your CSV to a UC volume (any format is allowed for volumes; UI uploads are limited to 5 GB per file).
  • From Catalog Explorer, click the file and use Create table to convert it into a managed table with the same preview and configuration options (headers, delimiter, types) as the upload page.
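
If you prefer SQL to Catalog Explorer clicks once the file is in a volume, the same conversion can be done with the `read_files` table-valued function from a SQL warehouse. A sketch only; the catalog, schema, volume, and table names below are placeholders:

```sql
-- Placeholder three-level names; adjust to your catalog/schema/volume.
CREATE TABLE main.default.sales_from_csv AS
SELECT * FROM read_files(
  '/Volumes/main/default/raw_files/sales.csv',
  format => 'csv',
  header => true
);
```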


2 REPLIES

Raman_Unifeye
Contributor III

The latest Databricks docs describe a native Read Excel feature; however, it's still in beta:

https://docs.databricks.com/aws/en/query/formats/excel

Until then, please follow this thread for Excel processing: https://community.databricks.com/t5/community-articles/reading-excel-files-folder/td-p/77116
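
In the meantime, a common workaround is converting the workbook to UTF-8 CSV with pandas, forcing every column to string so the upload UI's type inference cannot fail. A minimal sketch (the sample data and file names are placeholders; reading/writing .xlsx requires the openpyxl package):

```python
import os
import tempfile

import pandas as pd

# Build a small sample workbook to stand in for the real Excel file
# (placeholder data; writing .xlsx needs the openpyxl package).
workdir = tempfile.mkdtemp()
xlsx_path = os.path.join(workdir, "sales.xlsx")
pd.DataFrame({"First Name": ["Ann", "Bo"], "Qty": [1, 2]}).to_excel(xlsx_path, index=False)

# Read every cell back as a string so no type casting can fail later.
df = pd.read_excel(xlsx_path, dtype=str)

# Replace spaces in column names; the table-upload UI rejects them.
df.columns = [c.strip().replace(" ", "_") for c in df.columns]

# Export as UTF-8 CSV, ready for the "Create or modify a table" UI.
csv_path = os.path.join(workdir, "sales.csv")
df.to_csv(csv_path, index=False, encoding="utf-8")
```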

 

 


RG #Driving Business Outcomes with Data Intelligence
