What is the best approach to use Delta tables without Unity Catalog enabled?

narvinya
New Contributor

Hello!

I would like to work with Delta tables outside of the Databricks UI notebook. I know the best option would be databricks-connect, but I don't have Unity Catalog enabled.

What would be the most effective way to do this? I know it is possible to read and write Delta tables via JDBC with Spark, but is there a way to run Delta-specific operations directly? Is there any option other than performing Delta operations in my local Spark session?

1 ACCEPTED SOLUTION

Kaniz
Community Manager

Hi @narvinya,

• Delta tables can be accessed outside of a Databricks UI notebook without using Databricks Connect or Unity Catalog.

• Three options are available for working with Delta tables outside of a Databricks UI notebook:

 1. Using JDBC: read and write Delta tables over JDBC from your local Spark session. Delta-specific operations such as OPTIMIZE, VACUUM, or REPAIR still need to be executed in a Databricks notebook or through the Databricks REST API (see the first sketch after this list).
 2. Using the Delta Lake CLI: install the Delta Lake CLI on your local machine to interact with Delta tables: read data, write data, run SQL queries, and execute Delta-specific commands, with no need for a Databricks notebook or Unity Catalog.
 3. Using the Delta Lake Python API: work with Delta tables from Python. You can read and write Delta tables, perform data-manipulation operations, and execute Delta-specific commands locally (see the second sketch after this list).
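
For option 1, here is a minimal sketch of reading (and appending to) a table from a local Spark session through the Databricks JDBC driver. Everything in angle brackets (workspace host, SQL warehouse HTTP path, personal access token), the driver jar path, and the table names are placeholder assumptions you would replace with your own values:

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("delta-over-jdbc")
    # Assumes the Databricks JDBC driver jar has been downloaded locally.
    .config("spark.jars", "/path/to/DatabricksJDBC42.jar")
    .getOrCreate()
)

# Placeholder connection string for a Databricks SQL warehouse;
# AuthMech=3 means "username = token, password = personal access token".
jdbc_url = (
    "jdbc:databricks://<workspace-host>:443"
    ";httpPath=<sql-warehouse-http-path>"
    ";AuthMech=3;UID=token;PWD=<personal-access-token>"
)

# Read an existing Delta table; the query executes server-side in Databricks.
df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "my_schema.my_delta_table")
    .option("driver", "com.databricks.client.jdbc.Driver")
    .load()
)
df.show()

# Writes go through the same connector; append mode adds rows to the table.
(
    df.limit(10).write.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "my_schema.my_delta_table_copy")
    .option("driver", "com.databricks.client.jdbc.Driver")
    .mode("append")
    .save()
)
```

Note that with this approach the local session only speaks JDBC; the Delta-specific commands mentioned above still run inside Databricks.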
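For option 3, here is a minimal sketch using the open-source Delta Lake Python API (the delta-spark package, `pip install delta-spark`) against a Delta table stored at a plain filesystem path. The table path is a placeholder; the point is that Delta-specific operations such as updates and VACUUM run entirely in the local session:

```python
from delta import DeltaTable, configure_spark_with_delta_pip
from pyspark.sql import SparkSession

# Configure a local Spark session with the Delta Lake extensions enabled.
builder = (
    SparkSession.builder
    .appName("local-delta")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config(
        "spark.sql.catalog.spark_catalog",
        "org.apache.spark.sql.delta.catalog.DeltaCatalog",
    )
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

path = "/tmp/demo_delta_table"  # placeholder path; could also be cloud storage

# Create or overwrite a small Delta table at the path.
spark.range(100).write.format("delta").mode("overwrite").save(path)

# Delta-specific operations run locally, with no Databricks notebook needed.
dt = DeltaTable.forPath(spark, path)
dt.update(condition="id = 1", set={"id": "100"})  # row-level update
dt.vacuum(retentionHours=168)                     # VACUUM old files
dt.history().show()                               # inspect table history
```

This works against any storage your local session can reach; it does not go through a Databricks workspace at all, which is why no Unity Catalog is involved.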


