10-06-2022 01:58 PM
Hi guys,
I'm trying to use uuid in the merge but I always get an error...
import uuid
from pyspark.sql.functions import lit

(
    df_events.alias("events").merge(
        source = df_updates.alias("updates"),
        condition = "events.cod = updates.cod and events.num = updates.num"
    ).whenMatchedUpdate(
        set =
        {
            "events.cod" : "updates.cod",
            "events.num_c" : "updates.num_contrato",
            "events.status": lit("Updated")
        }
    ).whenNotMatchedInsert(values =
        {
            "events.id" : uuid.uuid4(),
            "events.cod" : "updates.cod",
            "events.num" : "updates.num",
            "events.status" : lit("Inserted")
        }
    )
    .execute()
)
Any tips?
10-07-2022 12:25 AM
Hi @William Scardua
Could you please share more details? Which DBR version are you currently using?
Please share the error stack trace or a screenshot with us.
Does it happen while importing the uuid module?
10-07-2022 04:14 AM
Hi @Akash Bhat
My cluster version is: 9.1 LTS (includes Apache Spark 3.1.2, Scala 2.12)
This is the error I get when I run the cell:
TypeError: Values of dict in 'values' in whenNotMatchedInsert must contain only Spark SQL Columns or strings (expressions in SQL syntax) as values, found '202d282c-045a-402c-895f-832c4c3a5190' of type '<class 'uuid.UUID'>'
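As the error message says, the values passed to whenNotMatchedInsert must be Column objects or SQL expression strings; a raw uuid.UUID object is neither. For reference, a minimal sketch of the string forms a Python UUID offers:

```python
import uuid

u = uuid.uuid4()

# A uuid.UUID object is what the type check above rejects.
assert isinstance(u, uuid.UUID)

# str() gives the hyphenated 36-character form,
# .hex gives the bare 32-character hexadecimal form.
hyphenated = str(u)   # e.g. '202d282c-045a-402c-895f-832c4c3a5190'
bare = u.hex          # e.g. '202d282c045a402c895f832c4c3a5190'
```

Either string form passes the type check, but note that a bare string is then parsed as a SQL expression, not as a literal value.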
10-07-2022 11:32 AM
@William Scardua:
Could you please refer to https://stackoverflow.com/questions/15859156/python-how-to-convert-a-valid-uuid-from-string-to-uuid
It might be helpful.
10-07-2022 02:26 PM
Hi @Sivaprasad C S I tried converting with uuid4().hex; see the error it reported:
).whenNotMatchedInsert(values =
{
"events.id" : uuid.uuid4().hex,
"events.cod_operadora" : "updates.cod_operadora",
"events.num_contrato" : "updates.num_contrato",
"events.qtd_residencia_ok": "updates.qtd_residencia_ok",
"events.data_atualiz": lit("Inclusao")
}
AnalysisException: cannot resolve `b6ff50957b0f492ca082e10227f07638` in INSERT clause given columns updates.`id` ...
10-14-2022 05:48 AM
Wrap it in lit(). As it is, Spark parses the bare hex string as a SQL expression and looks for a variable or field with that name:
lit(uuid.uuid4().hex)
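A minimal sketch of the corrected values dict (using the column names from the first post). One caveat worth knowing: uuid.uuid4() runs once on the driver, so lit(uuid.uuid4().hex) gives every row inserted by that merge run the same id. For a distinct id per row, you can pass Spark's built-in uuid() SQL function as the expression string instead:

```python
import uuid

# Quoting the hex string turns it into a SQL string literal, which is
# equivalent to wrapping it in lit(). Generated once on the driver, so
# every inserted row shares this one id.
same_id_for_all_rows = "'{}'".format(uuid.uuid4().hex)

insert_values = {
    "events.id": same_id_for_all_rows,
    "events.cod": "updates.cod",
    "events.num": "updates.num",
    "events.status": "'Inserted'",
}

# For a distinct id per inserted row, use Spark's built-in uuid()
# SQL function as the expression instead:
insert_values_per_row = dict(insert_values, **{"events.id": "uuid()"})
```

Both dicts can be passed straight to whenNotMatchedInsert, since SQL expression strings are accepted as values.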
11-16-2022 07:36 PM
Hi @William Scardua
Hope all is well! Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!