12-11-2021 06:15 AM
Hi All,
I am new to Databricks and am writing my first program.
Note: Code Shown Below:
I am creating a table with 3 columns to store data. 2 of the columns will be populated by appending data from another table.
When I run my append query into the empty table for 2 of the 3 fields, I get an error. I am assuming the table is not allowing null values in the 3rd of the 3 columns/fields.
Here is the error message:
"Column Member_cnt is not specified in INSERT."
If I add some value in the 3rd position, like "2", the append query works fine.
Here is my question:
How can I create a 3-column table that will accept, from an append query, data for only 2 of the 3 fields in the table (the third field gets updated later)?
My Code:
CREATE OR REPLACE TABLE MMR_Restated
(
  BID VARCHAR(20),
  YearMo LONG,
  Member_cnt INTEGER
);
This works (3 of 3):
INSERT INTO MMR_Restated (BID, YearMo, Member_cnt)
SELECT Bid, YearMo, 22
FROM All_Bids_All_Periods;
This doesn't work (2 of 3):
INSERT INTO MMR_Restated (BID, YearMo)
SELECT Bid, YearMo
FROM All_Bids_All_Periods;
Many Thanks
Kody_Devl
12-11-2021 11:30 AM
All is OK (although I don't like that behavior with INSERT ... SELECT 😉). You can either insert an explicit NULL, or define the column as generated:
CREATE OR REPLACE TABLE MMR_Restated
(
  BID VARCHAR(20),
  YearMo LONG,
  Member_cnt INTEGER
);
INSERT INTO MMR_Restated
SELECT "AAA", 111, NULL;
CREATE OR REPLACE TABLE MMR_Restated
(
  BID VARCHAR(20),
  YearMo LONG,
  Member_cnt INTEGER GENERATED ALWAYS AS (0)
);
INSERT INTO MMR_Restated (BID, YearMo)
SELECT "AAA", 111;
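Applied to the original tables, the explicit-NULL approach would look like this (a sketch, assuming the same MMR_Restated and All_Bids_All_Periods tables from the question):

```sql
-- Supply NULL explicitly for the column that will be filled in later,
-- so every target column is specified in the INSERT
INSERT INTO MMR_Restated (BID, YearMo, Member_cnt)
SELECT Bid, YearMo, NULL
FROM All_Bids_All_Periods;
```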
12-12-2021 02:52 AM
Hi Hubert,
Your answer moves me closer to being able to update a 26-field MMR_Restated table in pieces, as the correct field values are calculated through the process. I have been looking for a way to update in "pieces": 2 fields, then 4 more, then the remaining 20 fields.
Varchar GENERATED ALWAYS AS ('') works for strings
What I really would like is:
GENERATED ALWAYS AS Null
But of course, that doesn't work.
Any thoughts?
Thank you!
12-12-2021 08:36 AM
Yeah, GENERATED ALWAYS AS Null evaluates to false; that's why it is not working. I hope that it will be improved in the future. The Spark community knows about the INSERT INTO issues; on https://spark.apache.org/docs/latest/sql-ref-syntax-dml-insert-into.html we can read:
Note: The current behaviour has some limitations:
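For the "update in pieces" workflow described above, one option is to insert rows with NULL placeholders and fill the remaining fields later with an UPDATE, which Delta tables support. A minimal sketch, reusing Member_cnt and the sample value 22 from earlier in the thread:

```sql
-- Fill in Member_cnt later, once the correct value has been calculated
UPDATE MMR_Restated
SET Member_cnt = 22
WHERE Member_cnt IS NULL;
```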