
DeltaTable.forPath(spark, path) doesn't recognize table

amitca71
Contributor II

Hi,

I've been working with Unity Catalog for the last week.

I'm referring to the Delta table by path, as follows:

path = 's3://<my_bucket_name>/silver/data/<table_name>'

DeltaTable.forPath(spark, path)

I get an exception saying the path "is not a Delta table".

Using the table name with DeltaTable.forName(spark, <table_name>), everything works fine. Both attributes are exactly as they appear in the UC catalog (and the data appears in S3).

dbutils.fs.ls(path) also lists the content.

Until yesterday it was working fine; the problem started on the morning of 23/9/22.

Has anybody faced the same issue?

Thanks,

Amit

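As a first check, here is a minimal sketch (assuming a Databricks notebook where spark and dbutils are already defined, with the bucket and table names as placeholders): DeltaTable.forPath needs a _delta_log directory directly under the given path, and dbutils.fs.ls only confirms that files exist there, so it is worth verifying both.

from delta.tables import DeltaTable

# placeholder path, as in the post
path = 's3://<my_bucket_name>/silver/data/<table_name>'

# dbutils.fs.ls only shows that files exist; forPath specifically needs a
# _delta_log directory directly under this path
entries = dbutils.fs.ls(path)
print([f.name for f in entries])
print('_delta_log present:', any(f.name.rstrip('/') == '_delta_log' for f in entries))

# built-in check that the location is a Delta table root
print('isDeltaTable:', DeltaTable.isDeltaTable(spark, path))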

4 REPLIES

shan_chandra
Esteemed Contributor

@Amit Cahanovich - could you please share the full error stack trace?

AnalysisException: `s3://......` is not a Delta table.

---------------------------------------------------------------------------
AnalysisException                          Traceback (most recent call last)
<command-1069785358704323> in <cell line: 2>()
      1 silver_table_uri
----> 2 DeltaTable.forPath(spark, silver_table_uri)

/databricks/spark/python/delta/tables.py in forPath(cls, sparkSession, path, hadoopConf)
    385         jsparkSession: "JavaObject" = sparkSession._jsparkSession  # type: ignore[attr-defined]
    386
--> 387         jdt = jvm.io.delta.tables.DeltaTable.forPath(jsparkSession, path, hadoopConf)
    388         return DeltaTable(sparkSession, jdt)
    389

/databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1319
   1320         answer = self.gateway_client.send_command(command)
-> 1321         return_value = get_return_value(
   1322             answer, self.gateway_client, self.target_id, self.name)
   1323

/databricks/spark/python/pyspark/sql/utils.py in deco(*a, **kw)
    200                 # Hide where the exception came from that shows a non-Pythonic
    201                 # JVM exception message.
--> 202                 raise converted from None
    203             else:
    204                 raise

AnalysisException: `s3://.....` is not a Delta table.
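One way to narrow this down (a sketch only; the three-level table name below is a placeholder) is to compare the location Unity Catalog has registered for the table, which is what DeltaTable.forName resolves through, with the path being handed to DeltaTable.forPath. If the two differ, for example because the table is stored under a different location than the S3 prefix being used, forPath could fail with exactly this error even though forName works.

# location and format Unity Catalog has registered for the table
# (<catalog>.<schema>.<table_name> is a placeholder for the real three-level name)
detail = spark.sql("DESCRIBE DETAIL <catalog>.<schema>.<table_name>").collect()[0]
print('format:   ', detail['format'])
print('location: ', detail['location'])

# path that is being passed to DeltaTable.forPath
print('path used:', silver_table_uri)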

amitca71
Contributor II

It's even more weird: in one of the next cells it doesn't... (see the attached screenshots). On an older runtime version (10.4), it doesn't work even by name.
