- 9660 Views
- 5 replies
- 4 kudos
I am trying to read an external Iceberg table from an S3 location using the following command:

```python
df_source = (spark.read.format("iceberg")
    .load(source_s3_path)
    .drop(*source_drop_columns)
    .filter(f"{date_column}<='{date_filter}'")
)
```
...
- 2314 Views
- 2 replies
- 3 kudos
I get an error when writing a dataframe to an S3 location: `Found invalid character(s) among " ,;{}()\n\t=" in the column names of your...` I have gone through all the columns and none of them have any special characters. Any idea how to fix this?
Latest Reply
I got this error when I was running a query given to me, and the author didn't have aliases on aggregates. Something like `sum(dollars_spent)` needed an alias: `sum(dollars_spent) as sum_dollars_spent`.
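The reason an unaliased aggregate triggers this error is that Spark generates a column name from the expression itself, e.g. `sum(dollars_spent)`, and the parentheses fall in the character set the error message lists. A minimal pure-Python sketch of that validation (the `has_invalid_chars` helper is hypothetical, not a Spark API; it just mirrors the character set from the error message):

```python
# Characters rejected in column names, per the error message: " ,;{}()\n\t="
INVALID_CHARS = set(' ,;{}()\n\t=')

def has_invalid_chars(col_name: str) -> bool:
    """Return True if the column name contains any rejected character."""
    return any(c in INVALID_CHARS for c in col_name)

# Unaliased aggregate -> generated name contains parentheses -> write fails
print(has_invalid_chars("sum(dollars_spent)"))   # True

# Aliased aggregate -> clean name -> write succeeds
print(has_invalid_chars("sum_dollars_spent"))    # False
```

So even when every source column looks clean, check the output schema (`df.printSchema()`) for generated names from aggregates or expressions, and alias them before writing.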