10-06-2022 01:58 PM
Hi guys,
I'm trying to use uuid in a merge, but I always get an error...
import uuid
from pyspark.sql.functions import lit

(
    df_events.alias("events").merge(
        source = df_updates.alias("updates"),
        condition = "events.cod = updates.cod and events.num = updates.num"
    ).whenMatchedUpdate(set =
        {
            "events.cod" : "updates.cod",
            "events.num_c" : "updates.num_contrato",
            "events.status": lit("Updated")
        }
    ).whenNotMatchedInsert(values =
        {
            "events.id" : uuid.uuid4(),
            "events.cod" : "updates.cod",
            "events.num" : "updates.num",
            "events.status" : lit("Inserted")
        }
    )
    .execute()
)
Do you have any tips?
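For context, df_events is a Delta table handle (merge() is a DeltaTable method, not a DataFrame one) and df_updates is a plain DataFrame, loaded roughly like this; the table names are placeholders:

from delta.tables import DeltaTable

df_events = DeltaTable.forName(spark, "events_table")   # placeholder Delta table name
df_updates = spark.read.table("updates_table")          # placeholder source table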
10-07-2022 12:25 AM
Hi @William Scardua
Could you please share more details? Which DBR version are you currently using?
Please share the error stack trace or a screenshot with us.
Does the error occur while importing the UUID module?
10-07-2022 04:14 AM
Hi @Akash Bhat
My cluster version is: 9.1 LTS (includes Apache Spark 3.1.2, Scala 2.12)
This error occurs when I run the cell
TypeError: Values of dict in 'values' in whenNotMatchedInsert must contain only Spark SQL Columns or strings (expressions in SQL syntax) as values, found '202d282c-045a-402c-895f-832c4c3a5190' of type '<class 'uuid.UUID'>'
10-07-2022 11:32 AM
@William Scardua:
Could you please refer to https://stackoverflow.com/questions/15859156/python-how-to-convert-a-valid-uuid-from-string-to-uuid? It might be helpful.
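For example, to get a plain string out of a uuid.UUID object (the direction you would need here):

import uuid

u = uuid.uuid4()
s = str(u)    # '202d282c-045a-402c-895f-832c4c3a5190' style, with dashes
h = u.hex     # the same value without dashes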
10-07-2022 02:26 PM
Hi @Sivaprasad C S, I tried converting with uuid4().hex; here is the error it reported:
).whenNotMatchedInsert(values =
    {
        "events.id" : uuid.uuid4().hex,
        "events.cod_operadora" : "updates.cod_operadora",
        "events.num_contrato" : "updates.num_contrato",
        "events.qtd_residencia_ok": "updates.qtd_residencia_ok",
        "events.data_atualiz": lit("Inclusao")
    }
)
AnalysisException: cannot resolve `b6ff50957b0f492ca082e10227f07638` in INSERT clause given columns updates.`id` ...
10-14-2022 05:48 AM
Wrap it in lit() 🙂 As it is, Spark is looking for a variable or column named after the uuid string.
lit(uuid.uuid4().hex)
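Putting it together, a minimal sketch of the insert side with the uuid wrapped as a Column literal (the table names and handle setup are the same placeholders as above). Note that lit(uuid.uuid4().hex) is evaluated once on the driver, so every row inserted by a single merge run gets the same id; Spark SQL's built-in uuid() via expr("uuid()") generates one per row, although whether it is allowed inside a merge clause can depend on the DBR/Delta version.

from delta.tables import DeltaTable
from pyspark.sql.functions import lit
import uuid

df_events = DeltaTable.forName(spark, "events_table")    # placeholder Delta table name
df_updates = spark.read.table("updates_table")           # placeholder source table

(
    df_events.alias("events").merge(
        source = df_updates.alias("updates"),
        condition = "events.cod = updates.cod and events.num = updates.num"
    ).whenNotMatchedInsert(values =
        {
            # lit() turns the hex string into a Column literal, which is
            # one of the value types whenNotMatchedInsert accepts
            "events.id" : lit(uuid.uuid4().hex),
            "events.cod" : "updates.cod",
            "events.num" : "updates.num",
            "events.status" : lit("Inserted")
        }
    )
    .execute()
)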
11-16-2022 07:36 PM
Hi @William Scardua
Hope all is well! Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!