Couple of Delta Lake questions

jayallenmn
New Contributor III

Hey guys,

We're considering Delta Lake as the storage layer for our project and have a couple of questions. The first is pricing: we can't seem to find a page that says x amount costs y.

The second question is more technical: if we want to use the Python library to access our Delta Lake data instead of Spark, do we have to convert the Delta table to a pandas DataFrame? This blog seems to say so: https://databricks.com/blog/2020/12/22/natively-query-your-delta-lake-with-scala-java-and-python.htm.... Our concern is that our Delta Lake will hold many GBs of data, which won't fit in a single pandas DataFrame.

Jay

1 ACCEPTED SOLUTION


-werners-
Esteemed Contributor III

Delta Lake itself is free - it's a file format. But you will have to pay for storage and compute, of course.

If you want to use Databricks with Delta Lake, it won't be free unless you use the Community Edition.

Depending on what you're planning to do, the cost can range from very low to very high.

You can use Delta Lake without Databricks, by the way.

About your second question: pandas is indeed an option, and your concern is exactly why distributed data processing frameworks like Spark were created.

If you want to avoid Spark, you might want to look into Dask or Ray.


4 REPLIES


jayallenmn
New Contributor III

Thanks @Werner Stinckens - would you recommend processing Delta Lake data with Databricks/Spark?

-werners-
Esteemed Contributor III

totally!

Hi @Jay Allen,

Just a friendly follow-up. Did any of the responses help you resolve your question? If so, please mark it as the best answer. Otherwise, please let us know if you still need help.