06-07-2023 11:13 PM
06-08-2023 01:06 AM
If you define a column list for an insert, Databricks will assign the corresponding default value for each omitted column instead. If no default value is defined, it will insert NULL.
However, if one of those omitted columns is not nullable, an error is raised.
Can you check whether Client_Name is nullable in the Delta Lake table?
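The default-vs-NOT NULL behavior described above, and the nullability check, can be sketched with plain SQL semantics. This uses sqlite3 purely as a stand-in for a Delta table; the `WithoutDups` table and its `Client_Name`/`Region` columns are illustrative assumptions, not the poster's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE WithoutDups (
        Client_Id   INTEGER NOT NULL,
        Client_Name TEXT    NOT NULL,          -- not nullable, no default
        Region      TEXT    DEFAULT 'Unknown'  -- nullable, has a default
    )
""")

# Column list omits Region -> its declared default is filled in.
conn.execute("INSERT INTO WithoutDups (Client_Id, Client_Name) VALUES (1, 'Acme')")
print(conn.execute("SELECT Region FROM WithoutDups").fetchone()[0])  # Unknown

# Column list omits the NOT NULL Client_Name -> the insert raises an error.
try:
    conn.execute("INSERT INTO WithoutDups (Client_Id) VALUES (2)")
except sqlite3.IntegrityError as exc:
    print(exc)  # NOT NULL constraint failed: WithoutDups.Client_Name

# Checking whether a column is nullable (here via PRAGMA table_info; on
# Databricks you would inspect the table schema instead, e.g. DESCRIBE TABLE).
nullable = {name: not notnull
            for _, name, _, notnull, _, _ in conn.execute("PRAGMA table_info(WithoutDups)")}
print(nullable["Client_Name"])  # False
```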
06-08-2023 01:53 AM
06-08-2023 01:57 AM
AFAIK it should do that.
What version of Databricks are you on?
06-08-2023 02:05 AM
webapp_2023-06-03_03.32.06Z_master_
06-08-2023 02:23 AM
I mean the Databricks Runtime version, e.g. 11.3; you can see it on the cluster configuration. Or Databricks SQL Serverless, etc.
06-08-2023 02:27 AM
11.3 LTS (includes Apache Spark 3.3.0, Scala 2.12)
06-08-2023 02:32 AM
Well, it should work... Can you try the insert without the column list on the WithoutDups Delta table?
06-08-2023 03:36 AM
06-08-2023 03:42 AM
https://github.com/delta-io/delta/pull/1743
Spark 3.4 no longer requires users to provide all columns in insert-by-name queries.
Can you test with Spark 3.4?
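The relaxation in that PR concerns INSERT statements that name only a subset of the target columns. A minimal sketch of the now-permitted shape, again using sqlite3 (which has always allowed partial column lists) in place of Delta, with hypothetical table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Clients (
        Client_Id   INTEGER,
        Client_Name TEXT,
        Load_Date   TEXT   -- nullable, no default
    )
""")

# Insert-by-name with a partial column list: Load_Date is omitted, so it
# becomes NULL (a column with a declared default would get that instead).
conn.execute("INSERT INTO Clients (Client_Id, Client_Name) VALUES (1, 'Acme')")
print(conn.execute("SELECT * FROM Clients").fetchone())  # (1, 'Acme', None)
```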
06-09-2023 03:08 AM
Hi @Ranjith Marakal,
Thank you for posting your question in our community! We are happy to assist you.
To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?
This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance!
06-15-2023 07:36 PM