06-07-2023 11:13 PM
06-08-2023 01:06 AM
If you define a column list for an insert, Databricks will assign the corresponding default value for each omitted column. If no default value is defined, it will insert NULL.
However, if one of those omitted columns is not nullable, an error is raised.
Can you check whether Client_Name is nullable in the Delta Lake table?
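As a rough sketch of the behavior described above (the table and all columns other than Client_Name are hypothetical, and column defaults require a runtime that supports them):

```sql
-- Hypothetical table for illustration; only Client_Name comes from the thread.
CREATE TABLE clients (
  id INT NOT NULL,                 -- not nullable, no default: omitting it raises an error
  Client_Name STRING,              -- nullable, no default: omitting it inserts NULL
  region STRING DEFAULT 'unknown'  -- omitted columns with a default get that value
) USING DELTA;

-- Column-list insert: region falls back to its default, Client_Name to NULL.
INSERT INTO clients (id) VALUES (1);

-- This would fail: id is NOT NULL and has no default value.
-- INSERT INTO clients (Client_Name) VALUES ('Acme');
```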
06-08-2023 01:53 AM
06-08-2023 01:57 AM
AFAIK it should do that.
What version of Databricks are you on?
06-08-2023 02:05 AM
webapp_2023-06-03_03.32.06Z_master_
06-08-2023 02:23 AM
I mean the Databricks Runtime version, e.g. 11.3. You can see it in the cluster configuration, or whether you are on Databricks SQL Serverless, etc.
06-08-2023 02:27 AM
11.3 LTS (includes Apache Spark 3.3.0, Scala 2.12)
06-08-2023 02:32 AM
Well, it should work... Can you try the insert without the column list on the WithoutDups Delta table?
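For reference, an insert without a column list must supply one value for every column, in the table's column order; a minimal sketch (the values and column count are made up, since the schema of WithoutDups isn't shown in the thread):

```sql
-- Without a column list, provide a value for each column in table order.
INSERT INTO WithoutDups VALUES (1, 'SomeClient', 'some-value');
```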
06-08-2023 03:36 AM
06-08-2023 03:42 AM
https://github.com/delta-io/delta/pull/1743
Spark 3.4 no longer requires users to provide all columns in insert-by-name queries.
Can you test with Spark 3.4?
06-09-2023 03:08 AM
Hi @Ranjith Marakala,
Thank you for posting your question in our community! We are happy to assist you.
To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?
This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance!