12-02-2021 02:28 PM
Hello:
I am new to Databricks and need a little help with Delta table creation. I am having difficulty understanding how Delta tables are created. My questions are:
- Do I need to create an S3 bucket for a Delta table? If yes, do I have to mount it on a mount point?
- Do I need to define a schema before creating the table, and can I ingest a Parquet file according to that schema?
- I searched the Databricks docs for step-by-step instructions on Delta tables, but had no luck.
Can you please help me with the above questions?
Thank you for your help and support.
Labels: Databricks Delta Table, Delta table
Accepted Solutions
12-08-2021 04:37 PM
Hi Jay,
I would suggest starting by creating a managed Delta table. Please run this simple command:
CREATE TABLE events (id LONG) USING DELTA
This will create a managed Delta table called "events". Then run:
%sql describe extended events
The output includes a "Location" field showing where the table's data is stored; for a managed table this will be a default DBFS location.
This should give you an idea of how to create a managed Delta table and where its data is stored.
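As a minimal sketch of the full sequence in a notebook (the table and column names are just illustrative, and the warehouse path in the comment is the typical default, which may differ in your workspace):

```sql
-- Create a managed Delta table; its data goes to the default DBFS location
CREATE TABLE events (id LONG) USING DELTA;

-- Add a row, then inspect the table metadata
INSERT INTO events VALUES (1);
DESCRIBE EXTENDED events;
-- The "Location" row of the output shows where the data files live,
-- e.g. dbfs:/user/hive/warehouse/events
```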
Thanks
Mathan
12-03-2021 12:43 AM
There are two types of tables: managed and unmanaged (external). Managed tables store their data in the Databricks-managed storage account; unmanaged tables store their data somewhere you choose, e.g. your own data lake.
If you use managed tables, the storage is already available; for unmanaged tables you first have to make the external storage accessible, for example by mounting it.
https://docs.databricks.com/data/tables.html#managed-and-unmanaged-tables
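To sketch the unmanaged case: you point the LOCATION clause at your own storage (the bucket and path below are hypothetical, and your cluster needs access to that bucket, e.g. via a mount point or an instance profile):

```sql
-- External (unmanaged) Delta table: Databricks tracks the metadata,
-- but the data lives at the location you specify
CREATE TABLE events_ext (id LONG) USING DELTA
LOCATION 's3://my-bucket/delta/events';  -- hypothetical bucket/path

-- Dropping an external table removes only the metadata;
-- the data files at the LOCATION are left in place
DROP TABLE events_ext;
```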
Next you can create a table. Here you have the option of defining the table's schema manually, or deriving it automatically.
For the second option, how this works depends on the file type: if your source files are Parquet, Delta Lake picks up the schema automatically; for CSV you can set the inferSchema option to true.
https://docs.databricks.com/sql/language-manual/sql-ref-syntax-ddl-create-table-using.html
It is not hard to do, but you do need to read a few pages of the docs.
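For instance (paths hypothetical), a Delta table can be created directly from Parquet files, inheriting their schema; for CSV you can first define a view with schema inference and then write it to Delta:

```sql
-- Parquet: the schema is taken from the Parquet files themselves
CREATE TABLE events_from_parquet USING DELTA
AS SELECT * FROM parquet.`/mnt/raw/events/`;  -- hypothetical mount path

-- CSV: create a temporary view with schema inference,
-- then materialize it as a Delta table
CREATE TEMPORARY VIEW events_csv_vw
USING CSV
OPTIONS (path '/mnt/raw/events_csv/', header 'true', inferSchema 'true');

CREATE TABLE events_from_csv USING DELTA
AS SELECT * FROM events_csv_vw;
```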
12-03-2021 09:58 AM
Hello Werners:
Thanks for your reply.
After going through the documentation and your links, I am not seeing any specific steps saying that I first have to create a directory (mkdir) and then mount it on a mount point.
Can you please shed some light on that?
Thanks
Jay
12-08-2021 05:59 PM
Hello Mathan:
Thanks for your reply; I understand the part about creating the table now. I am more interested in learning about the internal workings of DBFS on AWS S3:
How is storage managed by DBFS for managed tables? What are the limitations of DBFS, i.e. what can it do and not do?
Any links on that would be a great help.
Thank you again