Hi @kp12 The error you are encountering occurs because PostgreSQL's `uuid` data type has no direct equivalent in Spark, so the JDBC driver sends your column as `character varying` and PostgreSQL refuses the implicit cast to `uuid`. The usual fix is to add `stringtype=unspecified` to the JDBC URL, which tells the PostgreSQL driver to send string values as untyped parameters so the server can infer the `uuid` type itself. Here is how you can do it:
```python
# Keep the id column as a plain string in the DataFrame;
# stringtype=unspecified lets PostgreSQL cast it to uuid on insert.
df.write.format("jdbc") \
    .option("url", "jdbc:postgresql://<hostname>:<port>/<database>?stringtype=unspecified") \
    .option("dbtable", "<table>") \
    .option("user", "<username>") \
    .option("password", "<password>") \
    .save()
```
In the above code, replace `<hostname>`, `<port>`, `<database>`, `<table>`, `<username>`, and `<password>` with your actual PostgreSQL connection details. Please note that you need to use "jdbc" as the format in the write method, not "postgresql". The `stringtype=unspecified` URL parameter is what allows the string values in the DataFrame to be stored in a `uuid` column without an explicit cast on the Spark side.
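If you want to catch malformed values before the write fails server-side, one option is to validate the strings with Python's standard `uuid` module first. This is a minimal sketch, not part of the JDBC fix itself; the helper name `is_valid_uuid` is just illustrative:

```python
import uuid


def is_valid_uuid(value):
    """Return True if value parses as a UUID string, False otherwise."""
    try:
        uuid.UUID(value)
        return True
    except (ValueError, AttributeError, TypeError):
        # ValueError: malformed string; TypeError/AttributeError: non-string input
        return False


# Example: check a couple of candidate values
print(is_valid_uuid("123e4567-e89b-12d3-a456-426614174000"))  # True
print(is_valid_uuid("not-a-uuid"))                            # False
```

You could wrap this in a Spark UDF returning `BooleanType()` and use it in a `filter` to drop or quarantine bad rows before calling `df.write`, assuming your id column holds UUID strings.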