Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Databricks Delta Table

JD2
Contributor

Hello,

I am new to Databricks and need a little help with Delta table creation. I am having difficulty understanding how to create a Delta table, specifically:

  1. Do I need to create an S3 bucket for a Delta table? If yes, do I have to mount it at a mountpoint?
  2. Do I need a schema before creating the table, and can I ingest a Parquet file according to that schema?
  3. I searched Databricks for step-by-step Delta table instructions, but had no luck.

Can you please help me with the questions above?

Thank you for your help & support.

1 ACCEPTED SOLUTION

Accepted Solutions

mathan_pillai
Databricks Employee

Hi Jay,

I would suggest starting with a managed Delta table. Please run a simple command:

CREATE TABLE events (id LONG) USING DELTA

This will create a managed Delta table called "events".

Then run:

%sql describe extended events

The output will include a "location" field showing where the data is stored; for a managed table this is a default DBFS location.

This will give you an idea of how to create a managed Delta table and tell you where its data is stored.
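For ingesting an existing Parquet file (question 2 above), you don't need to declare the schema up front: a CTAS statement can create a Delta table directly from the Parquet data. A minimal sketch, where the path is a placeholder for wherever your Parquet files actually live:

```sql
-- Create a managed Delta table from existing Parquet files.
-- The schema is taken from the Parquet files themselves.
-- `/mnt/raw/events/` is a hypothetical path; substitute your own.
CREATE TABLE events_from_parquet
USING DELTA
AS SELECT * FROM parquet.`/mnt/raw/events/`;
```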

Thanks

Mathan


4 REPLIES 4

-werners-
Esteemed Contributor III

There are two types of tables: managed and unmanaged. Managed tables store their data in the Databricks storage account.

Unmanaged tables store their data somewhere else, e.g. in your own data lake.

If you use managed tables, the storage is already mounted; for unmanaged tables you have to mount the storage first.

https://docs.databricks.com/data/tables.html#managed-and-unmanaged-tables
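For the unmanaged case, mounting is typically done once with `dbutils.fs.mount` from a notebook cell. A hedged sketch with placeholder names (this only runs inside a Databricks notebook, where `dbutils` is predefined, and the exact auth options depend on how your workspace accesses S3):

```python
# Sketch only: runs in a Databricks notebook, where `dbutils` is predefined.
# Bucket and mountpoint names are placeholders.
dbutils.fs.mount(
    source="s3a://my-example-bucket",     # hypothetical S3 bucket
    mount_point="/mnt/my-example-bucket"  # created by mount; no mkdir needed
)
# Afterwards the bucket's contents are visible under /mnt/my-example-bucket,
# e.g. via dbutils.fs.ls("/mnt/my-example-bucket") or Spark reads.
```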

Next you can create a table. Here you have the option to define the schema of the table manually, or to derive it automatically.

For the second option, how this works depends on the file type. If your source file is Parquet, Delta Lake already has the schema; for CSV you can set inferSchema to true.

https://docs.databricks.com/sql/language-manual/sql-ref-syntax-ddl-create-table-using.html
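The two schema options above can be sketched like this; the paths are placeholders for your own storage locations:

```sql
-- Parquet source: the schema comes from the Parquet files automatically.
CREATE TABLE events_parquet
USING PARQUET
LOCATION '/mnt/datalake/events_parquet/';  -- hypothetical path

-- CSV source: ask Spark to infer the schema from the data.
CREATE TABLE events_csv
USING CSV
OPTIONS (header 'true', inferSchema 'true')
LOCATION '/mnt/datalake/events_csv/';      -- hypothetical path
```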

It is not hard to do, but you do need to read a few pages of docs.

Hello Werners:

Thanks for your reply.

After going through the documentation along with your links, I am not seeing any specific steps saying that I first have to create a directory (mkdir) and then mount it at a mountpoint.

Can you please shed some light on that?

Thanks

Jay


Hello Mathan,

Thanks for your reply; I got the part about creating the table. I am more interested in learning about the internal workings of DBFS on AWS S3.

How is the storage managed by DBFS for managed tables? What can DBFS do, and what are its limitations?

Any links on that would be a great help.

Thank you again.
