10-05-2022 12:45 PM
Hi Everyone,
I am getting the following error when running a SQL query and do not understand what it means or what can be done to resolve it. Any recommendations?
View DDL:
CREATE VIEW myschema.table (
accountId,
agreementType,
capture_file_name,
capture_file_path,
createdDate,
currency,
de_agentid,
de_applicationshardid,
de_datacontenttype,
de_eventapplicationtime,
de_eventmode,
de_eventpublishtime,
de_eventsequenceid,
de_id,
de_partitionkey,
de_source,
documents,
effectiveDate,
eh_EnqueuedTimeUtc,
eh_Offset,
eh_SequenceNumber,
eh_SystemProperties_x_opt_enqueued_time,
eh_SystemProperties_x_opt_kafka_key,
endDate,
expirationDate,
externalId,
externalSource,
id,
isInWorkflow,
isSigned,
name,
notice,
noticeDate,
noticePeriod,
parties,
reconciled_file_name_w_path,
requestor,
resourceVersion,
status,
terminateForConvenience,
updatedDate,
value,
de_action,
de_eventapplication_year,
de_eventapplication_month,
de_eventapplication_day,
de_eventapplication_hour,
de_eventapplication_minute)
TBLPROPERTIES (
'transient_lastDdlTime' = '1664473495')
AS select * from parquet.`/mnt/store/de_entitytype=Agreement`
SQL query:
select de_id from myschema.table;
Error:
Error in SQL statement: AnalysisException: Cannot up cast documents from array<struct<accountId:string,agreementId:string,createdBy:string,createdDate:string,id:string,obligations:array<string>,resourceVersion:bigint,updatedBy:string,updatedDate:string>> to array<string>.
The type path of the target object is:
You can either add an explicit cast to the input data or choose a higher precision type of the field in the target object
Any recommendations on how to resolve this error would be much appreciated.
10-06-2022 01:01 AM
Seems like the nested struct gives you some issues.
Can you check this to see if it helps?
10-06-2022 09:18 AM
Thank you @Werner Stinckens, but it does not help much. The article you've linked explains new features of Databricks Runtime 3.0 for working with complex data types.
I am simply selecting a single column with a simple data type from a view that also has a column with complex data type. This query should not be failing like that. I hope this makes sense.
10-10-2022 01:22 AM
Sorry, I missed the select query.
Can you check if the view returns data when using the explorer (so in the Data tab)? Maybe it is corrupt.
10-10-2022 07:31 AM
@Werner Stinckens: Same error occurs in the explorer (so in the Data tab).
10-11-2022 01:23 AM
OK, that means the table/view does not correspond anymore to the actual data.
Can you create a temp view on the delta lake/parquet/... and run the same query?
It will probably work. And if it does, you will have to re-create the table to have a correct schema.
10-14-2022 07:06 AM
I re-created the view and got a similar error. It looks like when the schema of a complex column (like "documents" in my original post) changes slightly, the Spark view cannot handle it, which does not seem right.
10-14-2022 04:31 AM
Just read the parquet in Python and retrieve the auto-generated DDL schema:
parquetFile = spark.read.parquet("people.parquet")
parquetFile.createOrReplaceTempView("parquetFile")
# Grab the inferred schema as JSON, then let the JVM convert it to a DDL string
schema_json = spark.sql("SELECT * FROM parquetFile").schema.json()
ddl = spark.sparkContext._jvm.org.apache.spark.sql.types.DataType.fromJson(schema_json).toDDL()
print(ddl)
10-14-2022 12:33 PM
Thank you for chiming in @Hubert Dudek, but I'm not sure how this helps the view read its parquet files when some of them have struct and array columns with varying schemas, which is what appears to be the problem.
P.S.
I tried converting the complex data types to strings, but the struct/array schema inside the column disappears, leaving just the data values without their field names.
10-20-2022 05:35 AM
It helps in that it gives you the exact schema, which you can copy and paste into your code.
10-27-2022 06:39 PM
Thanks again.
11-13-2022 11:04 PM
Hi @Michael Okulik
Hope all is well! Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!