4 weeks ago
How can I upload a data file (like a CSV, Excel, or JSON file) from my computer into Databricks so I can start analyzing it there?
4 weeks ago
Hi @Suheb ,
I think the easiest way is to use the Data Ingestion tab:
And here you will be able to upload your local file:
For larger files, for other file formats, or for uploading files as a non-tabular dataset (without creating a table), the recommended approach is to upload them to a Volume in Unity Catalog.
4 weeks ago
The easiest way to import a local dataset into Databricks is by uploading it directly to the workspace. First, navigate to the Databricks workspace, then go to the "Data" tab. Click on "Add Data," select "Upload File," and choose your dataset from your local machine. Once uploaded, Databricks will store it in DBFS (Databricks File System). You can then access the file via Spark DataFrames using spark.read.csv() or other methods, depending on your file format. Alternatively, you can use the Databricks CLI or APIs for automation.
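As a sketch of the read step described above (the file path is a placeholder for whatever location the upload dialog shows you; this only runs inside a Databricks notebook, where `spark` is predefined):

```python
# Read an uploaded CSV into a Spark DataFrame inside a Databricks notebook.
# The path below is hypothetical; substitute the path shown after your upload.
df = spark.read.csv(
    "dbfs:/FileStore/tables/my_dataset.csv",  # placeholder upload location
    header=True,        # first row contains column names
    inferSchema=True,   # let Spark infer column types
)
display(df)  # preview the data in the notebook
```

For JSON you would use `spark.read.json()` instead; Excel files need an extra library or conversion to CSV first.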
4 weeks ago
Hi @Suheb ,
The easiest way to bring your local dataset (CSV, Excel, JSON, etc.) into Databricks is by creating a Volume, uploading your files there, and then creating a table on top of that data.
Here’s how you can do it step by step:
1. Create a Volume
In the Catalog Explorer, go to your schema (e.g., catalog.my_schema).
Click Create → Volume and give it a name (e.g., raw_data).
A Volume provides a secure, governed storage location for your files, managed by Unity Catalog.
2. Upload your local file
Open the Volume you just created.
Click Upload → Choose file, then select your CSV, Excel, or JSON file from your computer.
Databricks will store it in that Volume's path (click the file and copy its path for later use).
3. Create a Table referencing the file
Once uploaded, you can create a table directly in SQL:
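For example, here is a sketch using the Databricks SQL `read_files` function (the catalog, schema, Volume, and table names are placeholders; adjust them to the path you copied):

```sql
-- Create a table from all CSV files in the Volume folder.
-- Replace catalog.my_schema.raw_data with your own Volume path.
CREATE TABLE catalog.my_schema.my_table AS
SELECT *
FROM read_files(
  '/Volumes/catalog/my_schema/raw_data/',
  format => 'csv',
  header => true
);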
4. Keep adding new files (optional)
If you upload more files into the same Volume folder, a table or view defined over that folder can pick them all up, which makes this pattern well suited to incremental uploads.
5. Alternative option
You can also use the "Upload Data" button from the Workspace UI (Data tab) if you just want a quick, one-off import. But Volumes are best if you plan to keep the data managed and reusable.
4 weeks ago
For large relational databases such as Oracle, how can we move data to Azure Databricks? And how can we migrate from a non-Unity Catalog workspace to Unity Catalog?