02-20-2024 09:29 AM
Hi,
I am having an issue loading source data into a Delta table in Unity Catalog. The error we are receiving is the following:
grpc_message:"[DELTA_EXCEED_CHAR_VARCHAR_LIMIT] Exceeds char/varchar type length limitation. Failed check: (isnull(\'metric_name) OR (length(\'metric_name) <= 0))
We get this issue when executing the line below:
02-20-2024 05:49 PM
Hey @Paul92S
Looking at the error message, it appears the column "metric_name" is the culprit here:
Understanding the Error:
DELTA_EXCEED_CHAR_VARCHAR_LIMIT is raised when a value being written into a column declared as CHAR(n) or VARCHAR(n) fails the length check for that column; in your case the failed check is on "metric_name".
Troubleshooting Steps:
- Run DESCRIBE TABLE on the target table and confirm how "metric_name" is declared (CHAR/VARCHAR and its length).
- Compare the incoming data against that declared length and look for values that exceed it (a sketch follows below).
- Either widen the column, trim the offending values in the source, or relax the length check on write.
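A minimal sketch of those first two checks, assuming a hypothetical target table main.reporting.metrics and a hypothetical raw source path; replace 255 with whatever length DESCRIBE TABLE actually reports for metric_name:

from pyspark.sql import functions as F

# Confirm the declared type/length of metric_name on the target table
spark.sql("DESCRIBE TABLE main.reporting.metrics").show(truncate=False)

# Look for incoming values that would exceed the declared length
source_df = spark.read.format("delta").load("/mnt/raw/metrics")  # hypothetical source location
(source_df
    .filter(F.length("metric_name") > 255)
    .select("metric_name", F.length("metric_name").alias("name_length"))
    .show(truncate=False))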
Follow-ups are appreciated!
04-26-2024 07:43 AM
Hi Palash,
We have Unity Catalog tables created off external table locations, which are mounted to Databricks from Azure Data Lake. If a schema change has come through from upstream, it causes this issue. To resolve it, we read the new source column names from the raw layer, ran ALTER TABLE to add these new columns, and then ran UPDATE ... SET on those columns to give them an empty string ''. This then allowed the dataframe to overwrite the files in mounted storage without raising the error.
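For anyone hitting the same situation, a rough sketch of that sequence, assuming a hypothetical external table main.reporting.metrics, a hypothetical new upstream column extra_metric_tag, and hypothetical mount paths; the exact names will differ in your setup:

# Add the new upstream column to the existing external Delta table
spark.sql("""
    ALTER TABLE main.reporting.metrics
    ADD COLUMNS (extra_metric_tag STRING)
""")

# Backfill the new column with an empty string so existing rows pass the checks
spark.sql("""
    UPDATE main.reporting.metrics
    SET extra_metric_tag = ''
""")

# After the backfill, overwriting the mounted location from the source dataframe succeeded
source_df = spark.read.format("delta").load("/mnt/raw/metrics")   # hypothetical raw source
(source_df.write
    .format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .save("/mnt/datalake/metrics"))                               # hypothetical mounted target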
05-22-2024 10:29 AM
Setting this config before running the write command solved it for us: spark.conf.set("spark.sql.legacy.charVarcharAsString", True)
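For context, that flag makes Spark treat CHAR/VARCHAR columns as plain STRING, so the length check is skipped on write rather than the over-length values being fixed. A minimal sketch of where it sits relative to the write, using a hypothetical source path and target table:

# Relax CHAR/VARCHAR length enforcement for this session
spark.conf.set("spark.sql.legacy.charVarcharAsString", True)

# Hypothetical read-and-write that previously failed the length check
(spark.read.format("delta").load("/mnt/raw/metrics")
    .write
    .format("delta")
    .mode("append")
    .saveAsTable("main.reporting.metrics"))

Worth noting that this only bypasses the enforcement; if the VARCHAR limit matters downstream, the over-length values are still there.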