Error in SQL statement: AnalysisException: Cannot up cast documents from array

sage5616
Valued Contributor

Hi Everyone,

I am getting the following error when running a SQL query and do not understand what it means or what can be done to resolve it. Any recommendations?

View DDL:

CREATE VIEW myschema.table (
  accountId,
  agreementType,
  capture_file_name,
  capture_file_path,
  createdDate,
  currency,
  de_agentid,
  de_applicationshardid,
  de_datacontenttype,
  de_eventapplicationtime,
  de_eventmode,
  de_eventpublishtime,
  de_eventsequenceid,
  de_id,
  de_partitionkey,
  de_source,
  documents,
  effectiveDate,
  eh_EnqueuedTimeUtc,
  eh_Offset,
  eh_SequenceNumber,
  eh_SystemProperties_x_opt_enqueued_time,
  eh_SystemProperties_x_opt_kafka_key,
  endDate,
  expirationDate,
  externalId,
  externalSource,
  id,
  isInWorkflow,
  isSigned,
  name,
  notice,
  noticeDate,
  noticePeriod,
  parties,
  reconciled_file_name_w_path,
  requestor,
  resourceVersion,
  status,
  terminateForConvenience,
  updatedDate,
  value,
  de_action,
  de_eventapplication_year,
  de_eventapplication_month,
  de_eventapplication_day,
  de_eventapplication_hour,
  de_eventapplication_minute)
TBLPROPERTIES (
  'transient_lastDdlTime' = '1664473495')
AS SELECT * FROM parquet.`/mnt/store/de_entitytype=Agreement`

SQL query:

select de_id from myschema.table;

Error:

Error in SQL statement: AnalysisException: Cannot up cast documents from array<struct<accountId:string,agreementId:string,createdBy:string,createdDate:string,id:string,obligations:array<string>,resourceVersion:bigint,updatedBy:string,updatedDate:string>> to array<string>.
The type path of the target object is:
 
You can either add an explicit cast to the input data or choose a higher precision type of the field in the target object

Any recommendations on how to resolve this error would be much appreciated 🙂

12 Replies

-werners-
Esteemed Contributor III

Seems like the nested struct gives you some issues.

Can you check this to see if it helps?
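
A quick, hedged check (assuming the mount path from your DDL): compare the schema the view recorded at creation time against what the parquet files actually contain now.

# What the view thinks the schema is
spark.sql("DESCRIBE myschema.table").show(truncate=False)

# What the parquet files actually contain
spark.read.parquet("/mnt/store/de_entitytype=Agreement").printSchema()

If documents shows up as array<string> in the first output but array<struct<...>> in the second, the view's stored schema has drifted from the data.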

sage5616
Valued Contributor

Thank you @Werner Stinckens​ , but it does not help much. The article you've linked explains new features of Databricks Runtime 3.0 for working with complex data types.

I am simply selecting a single column with a simple data type from a view that also has a column with a complex data type. This query should not be failing like that. I hope this makes sense.

-werners-
Esteemed Contributor III

Sorry, I missed the select query.

Can you check if the view returns data when using the explorer (so in the Data tab)? Maybe it is corrupt.

sage5616
Valued Contributor

@Werner Stinckens​: Same error occurs in the explorer (so in the Data tab).

-werners-
Esteemed Contributor III

OK, that means the table/view no longer corresponds to the actual data.

Can you create a temp view on the delta lake/parquet/... and run the same query?

It will probably work. And if it does, you will have to re-create the table to have a correct schema.
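
A minimal sketch of that check, assuming the mount path from the original DDL:

# Temp view built directly on the parquet path, bypassing the
# persistent view's stored schema
df = spark.read.parquet("/mnt/store/de_entitytype=Agreement")
df.createOrReplaceTempView("agreement_tmp")
spark.sql("SELECT de_id FROM agreement_tmp").show()

Because the temp view infers its schema from the files at read time, it sidesteps any drift baked into the persistent view.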

sage5616
Valued Contributor

I re-created the view and got a similar error. It looks like when the complex data type changes slightly across files (columns like "documents" in my original post), the Spark view cannot handle it, which does not seem right.
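
As an aside, Spark's mergeSchema option for parquet reads handles some schema drift across files, but as far as I can tell it only merges compatible types, so it would not reconcile an array<string> against an array<struct>. A rough sketch of that read, using the same path:

# mergeSchema merges compatible schema drift across parquet files;
# truly conflicting types (array<string> vs. array<struct>) still fail
df = (spark.read
      .option("mergeSchema", "true")
      .parquet("/mnt/store/de_entitytype=Agreement"))
df.printSchema()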

Hubert-Dudek
Esteemed Contributor III

Just read the parquet in Python and retrieve the auto-generated DDL schema:

# Read the parquet files and let Spark infer the schema
parquetFile = spark.read.parquet("people.parquet")

# Serialize the inferred schema to JSON, then convert it to a DDL
# string via the JVM DataType helper (PySpark exposes no toDDL directly)
schema_json = parquetFile.schema.json()
ddl = spark.sparkContext._jvm.org.apache.spark.sql.types.DataType.fromJson(schema_json).toDDL()
print(ddl)

sage5616
Valued Contributor

Thank you for chiming in @Hubert Dudek​, but I'm not sure how this helps the view read its parquet files when some of them have columns with structs and arrays of varying schemas, which appears to be the problem.

P.S.

I tried converting the complex data types to strings, but then the struct/array schema inside the column disappears, leaving just the data values without their field names.
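
One workaround that may keep the field names is to_json, which serializes a struct/array column with its keys intact, unlike a plain cast to string. A sketch, assuming the documents column from the DDL above:

from pyspark.sql.functions import to_json, col

# CAST(documents AS STRING) drops the field names;
# to_json keeps them as JSON keys
df = spark.read.parquet("/mnt/store/de_entitytype=Agreement")
df = df.withColumn("documents_json", to_json(col("documents")))
df.select("documents_json").show(truncate=False)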

Hubert-Dudek
Esteemed Contributor III

It helps in that it gives you the exact schema 🙂, which you can copy-paste into your code.

sage5616
Valued Contributor

Thanks again.

  1. I need to use a Spark persistent view, and I don't think there is a way to specify the schema when creating one, though perhaps I'm missing something.
  2. Copy/pasting the schema into my code is hard-coding, which will not work in my case. I need the schema determined dynamically, because new partitions keep being added under the view. Sometimes fields within a complex-type column have no data, which apparently changes the schema of the entire complex column. Somehow, I need the persistent view to recognize that and handle it dynamically rather than failing.
  3. Querying the same parquet files without a persistent view yields the same error. Flattening the complex-type columns into separate simple-type columns takes care of the problem (see the sketch below).
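
For reference, a minimal sketch of that flattening, using the documents column and the struct fields named in the error message (explode the array, then promote the struct fields to top-level simple columns):

from pyspark.sql.functions import explode_outer, col

df = spark.read.parquet("/mnt/store/de_entitytype=Agreement")

# One row per array element; explode_outer keeps rows where documents is empty
flat = (df
        .withColumn("doc", explode_outer(col("documents")))
        .select("de_id",
                col("doc.accountId").alias("doc_accountId"),
                col("doc.agreementId").alias("doc_agreementId"),
                col("doc.createdDate").alias("doc_createdDate")))

flat.createOrReplaceTempView("agreement_flat")

Since every surviving column is a simple type, schema drift inside the nested struct no longer matters.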

Kaniz
Community Manager

Hi @Michael Okulik​, we haven't heard from you since the last response from @Hubert Dudek​, and I was checking back to see if you have a resolution yet.

If you have any solution, please share it with the community as it can be helpful to others. Otherwise, we will respond with more details and try to help.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.

Anonymous
Not applicable

Hi @Michael Okulik​ 

Hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!
