Error in SQL statement: AnalysisException: Cannot up cast documents from array
10-05-2022 12:45 PM
Hi Everyone,
I am getting the following error when running a SQL query and do not understand what it means or what can be done to resolve it. Any recommendations?
View DDL:
CREATE VIEW myschema.table (
accountId,
agreementType,
capture_file_name,
capture_file_path,
createdDate,
currency,
de_agentid,
de_applicationshardid,
de_datacontenttype,
de_eventapplicationtime,
de_eventmode,
de_eventpublishtime,
de_eventsequenceid,
de_id,
de_partitionkey,
de_source,
documents,
effectiveDate,
eh_EnqueuedTimeUtc,
eh_Offset,
eh_SequenceNumber,
eh_SystemProperties_x_opt_enqueued_time,
eh_SystemProperties_x_opt_kafka_key,
endDate,
expirationDate,
externalId,
externalSource,
id,
isInWorkflow,
isSigned,
name,
notice,
noticeDate,
noticePeriod,
parties,
reconciled_file_name_w_path,
requestor,
resourceVersion,
status,
terminateForConvenience,
updatedDate,
value,
de_action,
de_eventapplication_year,
de_eventapplication_month,
de_eventapplication_day,
de_eventapplication_hour,
de_eventapplication_minute)
TBLPROPERTIES (
'transient_lastDdlTime' = '1664473495')
AS select * from parquet.`/mnt/store/de_entitytype=Agreement`
SQL query:
select de_id from myschema.table;
Error:
Error in SQL statement: AnalysisException: Cannot up cast documents from array<struct<accountId:string,agreementId:string,createdBy:string,createdDate:string,id:string,obligations:array<string>,resourceVersion:bigint,updatedBy:string,updatedDate:string>> to array<string>.
The type path of the target object is:
You can either add an explicit cast to the input data or choose a higher precision type of the field in the target object
Any recommendations on how to resolve this error would be much appreciated 🙂
10-06-2022 01:01 AM
Seems like the nested struct gives you some issues.
Can you check this to see if it helps?
10-06-2022 09:18 AM
Thank you @Werner Stinckens, but it does not help much. The article you linked explains new features in Databricks Runtime 3.0 for working with complex data types.
I am simply selecting a single column with a simple data type from a view that also has a column with a complex data type. This query should not be failing like that. I hope this makes sense.
10-10-2022 01:22 AM
Sorry, I missed the select query.
Can you check if the view returns data when using the explorer (so in the Data tab)? Maybe it is corrupt.
10-10-2022 07:31 AM
@Werner Stinckens: Same error occurs in the explorer (so in the Data tab).
10-11-2022 01:23 AM
OK, that means the table/view no longer corresponds to the actual data.
Can you create a temp view on the delta lake/parquet/... and run the same query?
It will probably work. And if it does, you will have to re-create the table to have a correct schema.
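As a sketch, assuming the same mount path as the original view (the temp view name `agreement_tmp` is made up):

```sql
-- Temporary view created straight over the parquet files
CREATE OR REPLACE TEMPORARY VIEW agreement_tmp
AS SELECT * FROM parquet.`/mnt/store/de_entitytype=Agreement`;

-- The same query that failed against the persistent view
SELECT de_id FROM agreement_tmp;
```

A temporary view re-infers the schema from the files at creation time, so it side-steps a stale schema stored with the persistent view.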
10-14-2022 07:06 AM
I re-created the view and got a similar error. It looks like when the complex data type changes slightly (columns like "documents" in my original post), the Spark view cannot handle it, which does not seem right.
10-14-2022 04:31 AM
Just read the parquet in Python and retrieve the auto-generated DDL schema:
# Let Spark infer the schema from the parquet files
parquetFile = spark.read.parquet("people.parquet")
parquetFile.createOrReplaceTempView("parquetFile")
# Grab the inferred schema as JSON...
schema_json = spark.sql("SELECT * FROM parquetFile").schema.json()
# ...and turn it into a DDL string via the JVM (PySpark's StructType has no toDDL)
ddl = spark.sparkContext._jvm.org.apache.spark.sql.types.DataType.fromJson(schema_json).toDDL()
print(ddl)
10-14-2022 12:33 PM
Thank you for chiming in @Hubert Dudek, but I am not sure how this helps the view read its parquet files when some of them have struct and array columns with varying schemas, which appears to be the problem.
P.S.
I tried converting the complex data types to strings, but the struct or array schema inside the column disappears, leaving just the data values without their field names.
10-20-2022 05:35 AM
It helps in that it gives you the exact schema 🙂 which you can copy-paste into your code.
10-27-2022 06:39 PM
Thanks again.
- I need to use a Spark persistent view, and I don't think there is a way to specify the schema when creating one; perhaps I am missing something.
- Copying the schema into my code is hard-coding, which will not work in my case. I need the schema determined dynamically as new partitions are added under the view. Sometimes fields within a complex-type column have no data, which apparently changes the schema of the entire column. Somehow the persistent view needs to recognize and handle that dynamically rather than failing.
- Querying the same parquet files without a persistent view yields the same error. Flattening the complex-type columns into separate simple-type columns takes care of the problem.
11-13-2022 11:04 PM
Hi @Michael Okulik
Hope all is well! Just wanted to check in on whether you were able to resolve your issue; if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!