Create table using a location
05-10-2024 04:42 AM - edited 05-10-2024 04:43 AM
Hi,
Databricks newbie here. I have copied Delta files from my Synapse workspace into DBFS. To add them as a table, I executed:
CREATE TABLE audit_payload USING DELTA LOCATION '/dbfs/FileStore/data/general/audit_payload'
The command executed successfully. However, a SELECT on the table gives:
[DELTA_READ_TABLE_WITHOUT_COLUMNS] You are trying to read a Delta table `spark_catalog`.`default`.`audit_payload` that does not have any columns.
How can I bring in the table with all of its columns mapped?
Labels: Delta Lake
05-17-2024 12:50 AM
Can you read the Delta files using spark.read.format("delta").load("path/to/delta/table")?
If not, the directory does not hold a valid Delta table, which is my guess, since creating a table from an existing Delta location is little more than a metadata wrapper around the files that are already there.
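One thing worth checking before involving Spark at all: a valid Delta table directory contains a `_delta_log` subdirectory with JSON commit files. Also note that in Spark SQL on Databricks, DBFS locations are usually written as `dbfs:/FileStore/...`, while the `/dbfs/...` form is the local FUSE mount, so the LOCATION in the original CREATE TABLE may be pointing at an empty directory. Here is a minimal sketch of such a sanity check, assuming the FUSE-mounted path from the question (adjust it to your workspace):

```python
import os

def looks_like_delta_table(path):
    """Heuristic check: a Delta table directory contains a `_delta_log`
    subdirectory holding at least one JSON commit file."""
    log_dir = os.path.join(path, "_delta_log")
    if not os.path.isdir(log_dir):
        return False
    # Commit files are named like 00000000000000000000.json
    return any(name.endswith(".json") for name in os.listdir(log_dir))

# On a Databricks driver, the /dbfs FUSE mount exposes DBFS as a local
# filesystem, so the copied files can be inspected directly:
print(looks_like_delta_table("/dbfs/FileStore/data/general/audit_payload"))
```

If this prints False, the copy from Synapse likely brought over only the Parquet data files without the `_delta_log` directory, in which case the files can still be read as plain Parquet and re-written as a Delta table.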

