Dooley
Databricks Employee

Explode - Does the code below give you the same error?

from pyspark.sql import functions as F
from pyspark.sql import Row
eDF = spark.createDataFrame([Row(a=1, intlist=[1,2,3], mapfield={"a": "b"})])
eDF.select(F.explode(eDF.intlist).alias("anInt")).show()
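If explode is working for you, that should print one row per element of intlist instead of raising an error, something like:

+-----+
|anInt|
+-----+
|    1|
|    2|
|    3|
+-----+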

For the SQL method, what is the name of the column in the table that holds this JSON structure in each row? Let's say it is "contacts", and your JSON itself also starts its nesting with a "contacts" key; in that case the query would be:

SELECT contacts:contacts.emails[*].emailId FROM table_name

Same error with this?
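If it helps, here is a minimal sketch you can run in a Databricks notebook to try that path expression on toy data before pointing it at your real table. The view name "table_name", the column name "contacts", and the sample JSON/email values are just placeholders I made up to match the example above:

from pyspark.sql import Row

# One-row table whose "contacts" column is a JSON *string*;
# the ':' path syntax below is Databricks SQL and operates on string columns.
sample = '{"contacts": {"emails": [{"emailId": "a@example.com"}, {"emailId": "b@example.com"}]}}'
spark.createDataFrame([Row(contacts=sample)]).createOrReplaceTempView("table_name")

# Should return ["a@example.com","b@example.com"] as a JSON-encoded string
spark.sql("SELECT contacts:contacts.emails[*].emailId AS emailIds FROM table_name").show(truncate=False)

If that works but your real table errors out, the difference is probably in how the column is typed or how the JSON is nested, so sharing the full error message and the column's data type would help narrow it down.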