Many thanks for your reply.
Ah, we are creating the table using a delta location
s = f"create table {database}.{dataset_name} using delta location '{location}'"
spark.sql(s)
I can still query the special-character column using PySpark, which is good enough for me for now, but a lot of our users will want to use SQL. The only option seems to be to change the schema (rename the offending column).
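In case it helps, here is a minimal sketch of the rename approach, assuming a hypothetical column name like "unit-price" and that the source DataFrame is already loaded as df (both names are just placeholders, not our actual schema):

# Hypothetical: rename the offending column before writing the Delta data,
# so SQL users never have to deal with the special character.
clean_df = df.withColumnRenamed("unit-price", "unit_price")
clean_df.write.format("delta").mode("overwrite").save(location)
spark.sql(f"create table {database}.{dataset_name} using delta location '{location}'")

# Depending on the character, SQL users may also be able to quote the column
# with backticks instead of changing the schema, e.g.:
# spark.sql(f"select `unit-price` from {database}.{dataset_name}")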