Data Engineering

Error when writing dataframe to s3 location using PySpark

Constantine
Contributor III

I get an error when writing a dataframe to an S3 location:

Found invalid character(s) among " ,;{}()\n\t=" in the column names of your

I have gone through all the columns and none of them have any special characters. Any idea how to fix this?
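
The character set in the error includes spaces, tabs, and newlines, which are easy to miss when checking column names by eye. A minimal sketch for scanning them programmatically, assuming the DataFrame being written is named df (a hypothetical name):

# Scan column names for any of the characters the error message lists.
# `df` is assumed to be the PySpark DataFrame being written to S3.
invalid_chars = set(' ,;{}()\n\t=')
bad_columns = [name for name in df.columns
               if any(ch in invalid_chars for ch in name)]
print(bad_columns)  # names containing at least one offending character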

4 REPLIES

Kaniz
Community Manager

Hi @John Constantine, can you please share the script with us?

Emilie
New Contributor II

I got this error when running a query that was given to me; the author hadn't put aliases on the aggregates. Something like:

sum(dollars_spent)

needed an alias:

sum(dollars_spent) as sum_dollars_spent
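
In PySpark the same fix applies: without an alias, the aggregate's output column is literally named sum(dollars_spent), and the parentheses are among the characters listed in the error message. A minimal sketch, assuming a hypothetical DataFrame df with customer_id and dollars_spent columns and an S3 path of your own:

from pyspark.sql import functions as F

# Without .alias(), the result column would be named "sum(dollars_spent)",
# which contains parentheses and trips the column-name check on write.
aggregated = df.groupBy("customer_id").agg(
    F.sum("dollars_spent").alias("sum_dollars_spent")
)

# Hypothetical destination; replace with your own bucket and prefix.
aggregated.write.mode("overwrite").parquet("s3://your-bucket/prefix/")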

Kaniz
Community Manager

Hi @John Constantine, we haven't heard from you since the last response from @Emilie Myth, and I was checking back to see whether you have a resolution yet. If you do, please share it with the community, as it can help others; otherwise, we will follow up with more details and try to help.

Hi @John Constantine,

Just a friendly follow-up: do you still need help, or were you able to find a solution to this issue? Please let us know.
