Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Delta Lake in Apache Spark

asethia
New Contributor

Hi,

As per the documentation (https://docs.delta.io/latest/quick-start.html), we can configure DeltaCatalog using spark.sql.catalog.spark_catalog.
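
For reference, this is the configuration that quick-start describes, expressed as a PySpark session builder (it assumes the delta-spark package is on the classpath):

from pyspark.sql import SparkSession

# Delta quick-start setup: enable the Delta SQL extensions and register
# DeltaCatalog as Spark's built-in session catalog (spark_catalog).
spark = (
    SparkSession.builder
    .appName("delta-quickstart")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)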

Iceberg supports two catalog implementations (https://iceberg.apache.org/docs/latest/spark-configuration/#catalogs); see the configuration sketch after this list:

  • Replacing the session catalog (spark_catalog) - using org.apache.iceberg.spark.SparkSessionCatalog, which adds support for Iceberg tables to Spark's built-in catalog and delegates to the built-in catalog for non-Iceberg tables
  • Custom Catalog - org.apache.iceberg.spark.SparkCatalog - supports a Hive Metastore or a Hadoop warehouse as a catalog
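
For context, here is a minimal PySpark sketch of those two Iceberg options, based on the linked docs (the catalog name hadoop_cat and the warehouse path are placeholders, and the Iceberg Spark runtime jar is assumed to be on the classpath):

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-catalogs")
    # Option 1: replace the built-in session catalog; Iceberg tables are handled
    # by Iceberg, everything else is delegated to the built-in catalog.
    .config("spark.sql.catalog.spark_catalog", "org.apache.iceberg.spark.SparkSessionCatalog")
    .config("spark.sql.catalog.spark_catalog.type", "hive")
    # Option 2: an additional custom catalog backed by a Hadoop warehouse.
    .config("spark.sql.catalog.hadoop_cat", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.hadoop_cat.type", "hadoop")
    .config("spark.sql.catalog.hadoop_cat.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)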

Do we have an option similar to Iceberg in Delta Lake, where we can configure a custom catalog?

1 REPLY

Anonymous
Not applicable

@Arun Sethia:

Yes, Delta Lake also supports custom catalogs. Delta Lake builds on Spark's pluggable catalog API (the DataSourceV2 catalog plugin interface), which is the same mechanism Iceberg plugs into, so you can implement your own catalog and use Delta tables through it.

To use a custom catalog, set the configuration property spark.sql.catalog.my_custom_catalog to the fully qualified class name of your catalog implementation. Then you can use Delta tables as usual by qualifying the table identifier with the catalog and database, like so: my_custom_catalog.my_database.my_table.

Here's an example of how to wire a custom catalog up to Delta Lake from PySpark. The catalog itself has to be a JVM (Scala/Java) class that implements Spark's catalog plugin interface, org.apache.spark.sql.connector.catalog.TableCatalog, and its jar must be on the classpath; com.example.MyCustomCatalog below is a placeholder for that class:

from pyspark.sql import SparkSession

# Register the custom catalog when the session is built. The class must be a
# JVM implementation of Spark's TableCatalog interface, packaged in a jar on
# the classpath; com.example.MyCustomCatalog is a placeholder name.
spark = (
    SparkSession.builder
    .appName("delta-custom-catalog")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.my_custom_catalog", "com.example.MyCustomCatalog")
    .getOrCreate()
)

# Use Delta tables through the custom catalog via a three-part identifier
df = spark.read.table("my_custom_catalog.my_database.my_table")

In the above example, spark.sql.catalog.my_custom_catalog is set to the fully qualified name of your catalog implementation. This is the same mechanism the quick-start uses to register Delta Lake's own catalog, org.apache.spark.sql.delta.catalog.DeltaCatalog, as the session catalog spark_catalog. Once the catalog is registered, you use Delta tables as usual, with the custom catalog name as the first part of the table identifier.
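
The same three-part identifier also works from Spark SQL once the catalog is registered (again using the placeholder names above):

# Query the table through the custom catalog from SQL
spark.sql("SELECT * FROM my_custom_catalog.my_database.my_table").show()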

Hope this helps you figure out your solution!
