06-07-2023 11:13 PM
Accepted Solutions
06-08-2023 03:42 AM
https://github.com/delta-io/delta/pull/1743
Spark 3.4 no longer requires users to provide all columns in insert-by-name queries.
Can you test with Spark 3.4?
06-08-2023 01:06 AM
If you define a column list for an INSERT, Databricks assigns each omitted column its default value. If no default value is defined, it inserts NULL.
However, if one of those omitted columns is not nullable, an error is raised.
Can you check whether Client_Name is nullable in the Delta Lake table?
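The behavior described above can be sketched with standard SQL. This is a minimal illustration using SQLite (not Delta Lake); the table name `WithoutDups` comes from later in the thread, and the column set beyond `Client_Name` is assumed for the example.

```python
# Demonstrates partial-column-list INSERT semantics: omitted nullable
# columns are filled with NULL; an omitted NOT NULL column with no
# default raises an error. SQLite stands in for Delta Lake here.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# All columns nullable: the omitted columns become NULL.
cur.execute(
    "CREATE TABLE WithoutDups (Id INTEGER, Client_Name TEXT, Load_Date TEXT)"
)
cur.execute("INSERT INTO WithoutDups (Id) VALUES (1)")
row = cur.execute(
    "SELECT Id, Client_Name, Load_Date FROM WithoutDups"
).fetchone()
print(row)  # (1, None, None)

# A NOT NULL column with no default errors out when omitted.
cur.execute("CREATE TABLE Strict (Id INTEGER, Client_Name TEXT NOT NULL)")
try:
    cur.execute("INSERT INTO Strict (Id) VALUES (1)")
except sqlite3.IntegrityError as exc:
    print("error:", exc)
```

This is why checking the nullability of `Client_Name` matters: with nullable columns the partial insert succeeds and fills NULLs, while a NOT NULL column turns the same statement into an error.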
06-08-2023 01:53 AM
Thanks for the reply!
Yes, I checked Client_Name; all columns are nullable.
Is there any way I can have Client_Name and the remaining columns set to NULL? If yes, what would the query be?
06-08-2023 01:57 AM
AFAIK it should do that.
What version of Databricks are you on?
06-08-2023 02:05 AM
webapp_2023-06-03_03.32.06Z_master_
06-08-2023 02:23 AM
I mean the Databricks Runtime version, e.g. 11.3; it can be seen on the cluster. Or Databricks SQL Serverless, etc.
06-08-2023 02:27 AM
11.3 LTS (includes Apache Spark 3.3.0, Scala 2.12)
06-08-2023 02:32 AM
Well, it should work... Can you try without the column selection on the WithoutDups Delta table?
06-09-2023 03:08 AM
Hi @Ranjith Marakal,
Thank you for posting your question in our community! We are happy to assist you.
To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?
This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance!

