[DeltaTable] Usage with Unity Catalog (ParseException)

Ludo
New Contributor III

Hi,

I'm migrating my workspaces to Unity Catalog and the application to the three-level notation (catalog.database.table).

See: Tutorial: Delta Lake | Databricks on AWS

I'm getting the exception below when trying to use DeltaTable.forName(String name) or DeltaTable.tableName(String name) with three-level notation such as catalog.database.table:
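For reference, a minimal sketch of the call that triggers it (spark is an existing SparkSession; the table name matches the stack trace):

import io.delta.tables.DeltaTable

// Three-level Unity Catalog name: catalog.database.table
// This call throws the ParseException shown below
val table = DeltaTable.forName(spark, "spark_catalog.Gold.FactSyPerson")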

 

org.apache.spark.sql.catalyst.parser.ParseException :
[PARSE_SYNTAX_ERROR] Syntax error at or near '.'.(line 1, pos 18)

== SQL ==
spark_catalog.Gold.FactSyPerson
------------------^^^
at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:306)
at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:144)
at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:52)
at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parseTableIdentifier(ParseDriver.scala:54)
at io.delta.sql.parser.DeltaSqlParser.parseTableIdentifier(DeltaSqlParser.scala:138)
at io.delta.tables.DeltaTable$.forName(DeltaTable.scala:783)
at io.delta.tables.DeltaTable$.forName(DeltaTable.scala:770)

 

It looks like this isn't supported yet. Could you please help? Is there a workaround?

Thank you.

2 REPLIES

saipujari_spark
Valued Contributor

Hey @Ludo, I am able to access the table using the 3-level namespace.

[Screenshot: accessing the table with a three-level name in a notebook]

Make sure:

1. You are using a Unity Catalog-enabled cluster.

2. You are using the latest DBR (see the quick check below).
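If both are in place, something like this should resolve the table (the name here is illustrative; spark is the notebook session):

import io.delta.tables.DeltaTable

// A three-level name resolves on a UC-enabled cluster with a recent DBR
val dt = DeltaTable.forName(spark, "main.gold.fact_sy_person")
dt.toDF.show(5)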

Thanks,
Saikrishna Pujari
Sr. Spark Technical Solutions Engineer, Databricks

Ludo
New Contributor III

Thank you for the quick feedback @saipujari_spark 

Indeed, it's working great within a notebook on Databricks Runtime 13.2, which most likely has custom behavior for Unity Catalog.

It's not working in my Scala application running locally with direct use of the delta-core 2.4.0 libraries (no Databricks Runtime locally). I guess there is a missing piece within the delta-core libraries.
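For context, my local session is wired up roughly like this (a sketch of the standard delta-core setup from the Delta Lake docs, not my exact code):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("delta-local")
  // Standard Delta Lake configuration for delta-core 2.4.0
  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
  .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
  .getOrCreate()

// DeltaTable.forName(spark, "catalog.database.table") fails here with the ParseException above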

If my understanding is correct, I hope it will be updated soon.

Nevertheless, the Spark libraries are ready thanks to this change: [SPARK-39235] Make Catalog API be compatible with 3-layer-namespace - ASF JIRA (apache.org)
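For example, the plain Spark Catalog API now accepts three-part names (the name here is illustrative):

// Works on Spark 3.4+ thanks to SPARK-39235
spark.catalog.tableExists("catalog.database.table")
spark.catalog.getTable("catalog.database.table")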