Error when writing a DataFrame to an S3 location using PySpark
06-02-2022 03:18 PM
I get an error when writing a DataFrame to an S3 location:

Found invalid character(s) among " ,;{}()\n\t=" in the column names of your schema.
I have gone through all the columns and none of them have any special characters. Any idea how to fix this?
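In case it helps narrow this down, here is a minimal sketch (the example DataFrame, column names, and session setup are made up for illustration) that scans `df.columns` for the exact characters the error lists. Note that a column can pick up an invalid character without you typing one, e.g. an un-aliased aggregate ends up named `sum(dollars_spent)`, which contains parentheses:

```python
import re
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical DataFrame; "sum(dollars_spent)" mimics the name an
# un-aliased aggregate gets, which contains '(' and ')'.
df = spark.createDataFrame([(1, 100.0)], ["id", "sum(dollars_spent)"])

# The characters the error message complains about.
INVALID_CHARS = set(' ,;{}()\n\t=')

bad_columns = [c for c in df.columns if any(ch in INVALID_CHARS for ch in c)]
print(bad_columns)  # ['sum(dollars_spent)']

# One way to sanitize before writing: replace offending characters
# with underscores across all column names.
clean = df.toDF(*[re.sub(r'[ ,;{}()\n\t=]', '_', c) for c in df.columns])
print(clean.columns)  # ['id', 'sum_dollars_spent_']
```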
Labels:
- Pyspark
- S3 Location
2 REPLIES
06-07-2022 06:23 PM
I got this error when running a query that was given to me; the author hadn't put aliases on the aggregates. Something like:
sum(dollars_spent)
needed an alias:
sum(dollars_spent) as sum_dollars_spent
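For anyone hitting the same thing in the DataFrame API rather than SQL, here is a minimal sketch of the equivalent fix (the data and the S3 path are placeholders): calling `.alias()` on the aggregate gives the output column a clean name, so the write no longer trips the invalid-character check.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical data standing in for the real table.
df = spark.createDataFrame(
    [("a", 10.0), ("a", 5.0), ("b", 2.0)],
    ["customer", "dollars_spent"],
)

# Without .alias(), this column would be named "sum(dollars_spent)",
# whose parentheses trigger the invalid-character error on write.
agg = df.groupBy("customer").agg(
    F.sum("dollars_spent").alias("sum_dollars_spent")
)

# Hypothetical S3 path; the aliased name contains no invalid characters.
agg.write.mode("overwrite").parquet("s3://my-bucket/some/prefix/")
```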
07-28-2022 06:19 PM
Hi @John Constantine,
Just a friendly follow-up. Do you still need help, or were you able to find a solution to this issue? Please let us know.

