Hi @narvinya,
• Delta tables can be accessed outside of a Databricks notebook without using Databricks Connect or Unity Catalog.
• Three options are available for working with Delta tables outside of a Databricks notebook:
1. Using JDBC: Read and write Delta tables through JDBC from your local Spark session. Delta-specific operations like OPTIMIZE, VACUUM, or REPAIR still require executing commands in a Databricks notebook or through the Databricks REST API.
2. Using the Delta Lake CLI: Install the Delta Lake CLI on your local machine to interact with Delta tables. You can perform Delta operations such as reading data, writing data, running SQL queries, and executing Delta-specific commands, with no need for a Databricks notebook or Unity Catalog.
3. Using the Delta Lake Python API: Work with Delta tables directly from Python. You can read and write Delta tables, perform data manipulation operations, and run Delta-specific commands such as optimize and vacuum from your local environment, without a Databricks notebook.