Hi @Suheb,
The easiest way to bring your local dataset (CSV, Excel, JSON, etc.) into Databricks is by creating a Volume, uploading your files there, and then creating a table on top of that data.
Here's how you can do it step by step:
1. Create a Volume (recommended approach)
In the Catalog Explorer, go to your schema (e.g., catalog.my_schema).
Click Create → Volume and give it a name (e.g., raw_data).
A Volume provides a secure, governed storage location for your files, managed by Unity Catalog.
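If you prefer SQL over the UI, you can also create the Volume with a short statement; the catalog and schema names below are placeholders for your own:

-- Create a Unity Catalog managed Volume (placeholder names)
CREATE VOLUME IF NOT EXISTS my_catalog.my_schema.raw_data
COMMENT 'Landing area for uploaded source files';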
2. Upload your local file
Open the Volume you just created.
Click Upload → Choose file, then select your CSV, Excel, or JSON file from your computer.
Databricks will store it under that Volume's path (click the file and copy its path; you'll need it in the next step).
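To confirm the upload from a notebook or the SQL editor, you can list the Volume's contents (placeholder path below):

-- List the files currently stored in the Volume (placeholder path)
LIST '/Volumes/my_catalog/my_schema/raw_data/';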
3. Create a Table referencing the file
CREATE TABLE main.default.my_table AS
SELECT * FROM read_files(
  '/Volumes/.../raw_data/',
  format => 'csv',
  inferSchema => true,
  header => true
);
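Once that runs, a quick sanity check confirms the schema was inferred as expected:

-- Inspect the inferred column names and types
DESCRIBE TABLE main.default.my_table;

-- Preview a few rows
SELECT * FROM main.default.my_table LIMIT 10;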
4. Keep adding new files (optional)
You can upload more files into the same Volume folder later and load them into the table as they arrive (see the sketch below).
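One way to pick up only the new files is COPY INTO, which remembers which files it has already ingested. A minimal sketch, assuming the table from step 3 and a placeholder Volume path:

-- Incrementally load only files that have not been ingested yet (placeholder path)
COPY INTO main.default.my_table
FROM '/Volumes/my_catalog/my_schema/raw_data/'
FILEFORMAT = CSV
FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true')
COPY_OPTIONS ('mergeSchema' = 'true');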
5. Alternative option