Databricks Delta Table

JD2
Contributor

Hello:

I am new to Databricks and need a little help with Delta table creation.

I am having difficulty understanding how to create a Delta table. My questions are:

  1. Do I need to create an S3 bucket for a Delta table? If yes, do I have to mount it on a mount point?
  2. Do I need a schema before creating the table, and can I ingest a Parquet file according to that schema?
  3. I searched Databricks for step-by-step instructions on Delta tables, but had no luck.

Can you please help me with the questions above?

Thank you for your help and support.


6 REPLIES

Kaniz
Community Manager

Hi @JayDAVE! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer first; otherwise I will get back to you soon. Thanks.

-werners-
Esteemed Contributor III

There are two types of tables: managed and unmanaged. Managed tables store their data in the Databricks storage account.

Unmanaged tables store their data somewhere else, e.g. your own data lake.

If you use managed tables, the storage is already mounted; for unmanaged tables you have to mount the storage first.

https://docs.databricks.com/data/tables.html#managed-and-unmanaged-tables

Next you can create a table. Here you have the option to define the table's schema manually, or to derive the schema automatically.

For the second option, how this works depends on the file type. If your source file is Parquet, Delta Lake picks up the schema automatically; for CSV you can set inferSchema to true.

https://docs.databricks.com/sql/language-manual/sql-ref-syntax-ddl-create-table-using.html

It is not hard to do, but you do need to read a few pages of docs.
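As a rough sketch of the two options above (the table names and the Parquet path are placeholders for illustration, not from the original thread):

```sql
-- Option 1: define the schema manually (managed Delta table).
CREATE TABLE sales_managed (
  id     BIGINT,
  amount DOUBLE
) USING DELTA;

-- Option 2: derive the schema automatically from existing Parquet
-- files via CTAS (replace the path with your own mounted location).
CREATE TABLE sales_from_parquet
USING DELTA
AS SELECT * FROM parquet.`/mnt/raw/sales/`;
```

The CTAS form also answers the Parquet-ingestion part of the question: the Delta table inherits the schema from the Parquet files it reads.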

Hello Werners:

Thanks for your reply.

After going through the documentation along with your links, I am not seeing any specific steps saying that I first have to create a directory (mkdir) and then mount a mount point.

Can you please shed light on that?

Thanks

Jay

mathan_pillai
Valued Contributor

Hi Jay,

I would suggest starting with a managed Delta table. Please run a simple command:

CREATE TABLE events (id LONG) USING DELTA

This will create a managed Delta table called "events".

Then run:

%sql describe extended events

The command above shows the table's "location", where the data will be stored. This will be a default DBFS location.

This will give you an idea of how to create a managed Delta table and where its data is stored.

Thanks

Mathan
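Putting those two steps together in one notebook cell, with an insert added so the table actually holds data (a sketch assuming a Databricks SQL context):

```sql
-- Create the managed table from the reply above and add one row.
CREATE TABLE events (id LONG) USING DELTA;

INSERT INTO events VALUES (1);

-- DESCRIBE DETAIL returns a `location` column; for a managed table
-- this typically points at the default DBFS warehouse path,
-- e.g. dbfs:/user/hive/warehouse/events
DESCRIBE DETAIL events;
```

Because no LOCATION clause was given, dropping this table would also delete its underlying data, which is the defining behavior of a managed table.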

Hello Mathan:

Thanks for your reply. I got the part about creating the table. I am more interested in learning about the internal workings of DBFS on AWS S3.

How is storage managed by DBFS for managed tables? What are the limitations of DBFS, i.e. what it can and cannot do?

Any links on that would be a great help.

Thank you again

Kaniz
Community Manager

Hi @Jay DAVE,

Please go through the comprehensive description of DBFS here. Thanks.
