05-12-2022 05:55 AM
Hi everyone, I have connected to Cosmos DB using this tutorial: https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/cosmos/azure-cosmos-spark_3_2-12/Samples/D...
I then created a table with a simple SQL command:
CREATE TABLE mydb.cosmos_table AS
SELECT *
FROM cosmosCatalog.mycosmosdb.mycosmoscontainer
LIMIT 100
The statement finishes successfully, but when querying the table or checking it in the "Data" pane on the left, I receive this error: apparently the column "BankErrorDescription" is of type void, and according to the error it then can't be found.
How can I turn this void column into a string, for example? One approach would be to unload to storage and then load it again, but I'd prefer a more direct solution.
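To illustrate what I mean by "more direct": something along the lines of the sketch below, casting the affected column while creating the table, is what I'm hoping for. This is just a sketch I haven't verified; the _str alias is only there to avoid a name clash with the column already pulled in by SELECT *.

# Sketch only: cast the void column to STRING while creating the table.
# BankErrorDescription is the affected column from my error; the *_str alias
# avoids a duplicate name with the column already included by SELECT *.
spark.sql("""
    CREATE TABLE mydb.cosmos_table AS
    SELECT *,
           CAST(BankErrorDescription AS STRING) AS BankErrorDescription_str
    FROM cosmosCatalog.mycosmosdb.mycosmoscontainer
    LIMIT 100
""")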
05-25-2022 11:19 PM
@Joel iemma, maybe you can also take a look at https://stackoverflow.com/questions/11563732/change-a-value-in-a-column-in-sqlite, along with the tutorial you mentioned, https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/cosmos/azure-cosmos-spark_3_2-12/Samples/D... . I am not an expert, but maybe this helps.
05-26-2022 08:02 AM
Have you tried ALTER COLUMN?
ALTER TABLE [tbl_name]
ALTER COLUMN [col_name_1] [DATA_TYPE]
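Filled in with the table and column names from the question, that suggestion might look like the sketch below. This is only an illustration; whether the runtime actually allows changing a column's type this way, especially from void, is not something I've verified.

# Illustration only: table and column names taken from the question.
# Changing a column's type via ALTER COLUMN may not be supported for all
# table formats/runtimes, and may fail for a void column.
spark.sql("""
    ALTER TABLE mydb.cosmos_table
    ALTER COLUMN BankErrorDescription TYPE STRING
""")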
05-27-2022 02:56 AM
The way I solved it more dynamically with Python was:

from pyspark.sql.functions import lit

df = ...
# Collect (column name, type) pairs and keep the columns whose Spark type is 'void'
cols = [(name, str(dtype)) for name, dtype in df.dtypes]
void_cols = [name for name, dtype in cols if dtype == 'void']
print(void_cols)
# Replace each void column with a NULL literal cast to string
for c in void_cols:
    df = df.withColumn(c, lit(None).cast('string'))
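If you then want the table itself to carry the corrected schema, a minimal sketch of writing the DataFrame back would be the following, assuming the table name from the question and that overwriting it is acceptable:

# Sketch: persist the corrected DataFrame back; "mydb.cosmos_table" is the
# table name from the question and overwriting it is an assumption. For a
# Delta table, overwriteSchema is likely needed because the column type changes.
(df.write
   .mode("overwrite")
   .option("overwriteSchema", "true")
   .saveAsTable("mydb.cosmos_table"))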
05-30-2022 11:43 AM
That's good to know; glad you were able to solve this.
07-07-2022 09:14 AM
Hey there @Joel iemma
Hope all is well! Just wanted to check in to see if you would be happy to mark an answer as best for us, please? It would be really helpful for the other members too.
Cheers!