10-12-2021 11:54 PM
I have created an external table using Spark via the below command (using Data Science & Engineering):
df.write.mode("overwrite").format("parquet").saveAsTable(name=f'{db_name}.{table_name}', path="dbfs:/reports/testing")
I have tried to delete a row based on a filter condition using a SQL endpoint (using SQL):
DELETE FROM
testing.mobile_number_table
WHERE
x1 == 9940062964
I am getting the below error message:
Spark 3.0 Plans are not fully supported on table acl or credential passthrough clusters: DeleteFromTable (x1#6341L = 9940062964)
10-13-2021 02:38 AM
Try using:
.format("delta")
If that doesn't help, I would check the DBFS mount.
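For example, a minimal sketch of the Delta version of your write, reusing the same db_name/table_name and path from the question (if the existing Parquet table or files are still at that location, you may need to drop the table or clear the path first):

df.write.mode("overwrite") \
    .format("delta") \
    .saveAsTable(name=f'{db_name}.{table_name}', path="dbfs:/reports/testing")

# once the table is stored as Delta, the DELETE from the question should be accepted
spark.sql(f"DELETE FROM {db_name}.{table_name} WHERE x1 = 9940062964")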
10-18-2021 12:05 AM
Hi @Hubert Dudek, is Delta the only way to update and delete records?
10-18-2021 03:34 AM
Using the Delta file format is the only ("real") way to delete something from a file, because Delta is transactional: it records a commit saying the record was deleted, a bit like SQL/git.
With other file formats you would have to overwrite everything every time.
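As a sketch, the same delete can also be done through the Delta Lake Python API instead of SQL, assuming the table at dbfs:/reports/testing has already been rewritten as Delta:

from delta.tables import DeltaTable

dt = DeltaTable.forPath(spark, "dbfs:/reports/testing")
# rewrites only the files that contain matching rows and records the change
# as a new commit in the _delta_log; the old files remain until VACUUM
dt.delete("x1 = 9940062964")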
10-13-2021 05:55 AM
Thank you.
10-13-2021 10:41 AM
If it helped, you can choose my answer as the best one.
10-13-2021 10:46 AM
Hi @karthick J,
It seems like the error is coming from your table permissions. Are you using a high-concurrency cluster? If so, check whether table ACLs are enabled. Also try testing it on a standard cluster.
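A quick way to check from a notebook whether table ACLs are active on the cluster (the config key below is the legacy table access control setting; treat it as an assumption on newer runtimes, hence the default value):

# returns "true" on clusters with table access control enabled
print(spark.conf.get("spark.databricks.acl.dfAclsEnabled", "false"))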
10-18-2021 02:48 PM
Hi @karthick J,
Can you try to delete the row and execute your command on a non-high-concurrency cluster? The reason I'm asking is that we first need to isolate the error message and understand why it is happening in order to find the best solution. Is this issue still blocking you, or have you been able to mitigate/solve it?
10-20-2021 12:01 AM
Hi @Jose Gonzalez,
I have created the table via Spark on a non-high-concurrency cluster, not writing it in Delta format. When I try to delete the row I still get the error, both 1. via Spark notebook SQL and 2. via a SQL query.
df.write.mode("overwrite").format("parquet").saveAsTable(name=f'{db_name}.{table_name}', path="dbfs:/reports/testing")
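For reference, one way to keep the existing Parquet files and still get DELETE support would be to convert the table in place to Delta (sketch only, using the table name from the question):

# converts the Parquet table to Delta in place; afterwards the DELETE should work
spark.sql("CONVERT TO DELTA testing.mobile_number_table")
spark.sql("DELETE FROM testing.mobile_number_table WHERE x1 = 9940062964")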