Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Delta table cannot be previewed in the Data UI

chanansh
Contributor

I have saved a Delta table which I can read using `spark.table(table_name)`. However, when I open the "Data" panel in Databricks and select that table, I get an error:

```
An error occurred while fetching table: <table_name>

com.databricks.backend.common.rpc.DatabricksExceptions$SQLExecutionException: org.apache.spark.sql.AnalysisException: Incompatible format detected.

A transaction log for Databricks Delta was found at `<path>/_delta_log`,
but you are trying to read from `<path>` using format("parquet"). You must use
'format("delta")' when reading and writing to a delta table.

To disable this check, SET spark.databricks.delta.formatCheck.enabled=false
To learn more about Delta, see https://docs.databricks.com/delta/index.html
```
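The error comes from Delta's format check: if a directory contains a `_delta_log` transaction log, Spark refuses to read it with any format other than `delta`. A minimal pure-Python sketch of that rule (the function name and message are illustrative, not the actual Databricks implementation):

```python
import os

def check_read_format(path: str, fmt: str) -> None:
    """Approximate Delta's format check: a directory that holds a
    `_delta_log` transaction log may only be read with format("delta")."""
    has_delta_log = os.path.isdir(os.path.join(path, "_delta_log"))
    if has_delta_log and fmt != "delta":
        raise ValueError(
            f"Incompatible format detected: a transaction log was found at "
            f'`{path}/_delta_log`, but you are reading with format("{fmt}")'
        )
```

As the message says, the check can be disabled with `spark.databricks.delta.formatCheck.enabled=false`, but the real fix is to read the table with `format("delta")` (or ensure the metastore entry is registered as a Delta table).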

3 REPLIES

Debayan
Databricks Employee

Hi, this could be an issue with the Hive metastore. Could you please recheck the configuration, and let us know whether you are using an external metastore?

chanansh
Contributor

I don't know. I saved the table with Auto Loader, writing a structured stream into a table as follows:

```
(df.writeStream
    .format("delta")  # <-----------
    .option("checkpointLocation", checkpoint_path)
    .option("path", output_path)
    .trigger(availableNow=True)
    .toTable(table_name, format='delta'))
```

Yet the produced table does not seem to be a Delta table: I cannot read it in Redash, nor preview it in the Data panel in Databricks. Moreover, `delta.DeltaTable.isDeltaTable(spark, TABLE_NAME)` returns `False`.
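One way to cross-check what `isDeltaTable` reports is to look at the storage location directly: a Delta table's directory contains a non-empty `_delta_log` subdirectory, while plain parquet output does not. A hedged, filesystem-level sketch (this only approximates `DeltaTable.isDeltaTable`, which actually parses the log):

```python
import os

def looks_like_delta(path: str) -> bool:
    """Rough filesystem check: a Delta table directory contains a
    non-empty `_delta_log` subdirectory; plain parquet output does not."""
    log_dir = os.path.join(path, "_delta_log")
    return os.path.isdir(log_dir) and bool(os.listdir(log_dir))
```

On DBFS or cloud-storage paths you would list with `dbutils.fs.ls` instead of `os.listdir`, but the idea is the same: if no `_delta_log` exists under `output_path`, the stream never committed Delta data there.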

Did you write any data? Did your micro-batch process any data? You can check by running `%sql describe extended <delta_table_name>`; the output includes the path where the table's data is stored. Then try listing the files at that path: a `_delta_log` directory must be present there.
