11-07-2022 02:43 PM
I'm trying to select a column whose name contains special characters in Databricks SQL.
I've tried the following:
%sql
select ex$col
from database.table
limit 10;
%sql
select `ex$col`
from database.table
limit 10;
Neither works; both return "ex does not exist. Did you mean the following: ..."
Is there a way to do this, or is this functionality not currently supported?
11-07-2022 07:33 PM
It might depend on how your table was created. I was able to get the below to work:
CREATE OR REPLACE TABLE database.special_chars (`ex$col` int);
INSERT INTO database.special_chars VALUES (1);
SELECT `ex$col`
FROM database.special_chars
LIMIT 10;
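If the backticked SELECT still fails for you, it may be worth checking what the column is actually called in the stored schema. A quick way from a Python cell (the table name here is the illustrative one from my example above):

# Describe the table to see each column name exactly as it is stored;
# database.special_chars is the example table created above.
spark.sql("DESCRIBE TABLE database.special_chars").show(truncate=False)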
11-08-2022 08:21 PM
Many thanks for your reply.
Ah, we create the table using a Delta location:
s = f"create table {database}.{dataset_name} using delta location '{location}'"
spark.sql(s)
I can still query the special-character column using PySpark, which is fine for me for now, but a lot of our users will want to use SQL. The only option seems to be to change the schema.
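For anyone else reading, a minimal sketch of the PySpark route I mean (the table and column names are the illustrative ones from my question):

# spark is the SparkSession provided by the Databricks notebook.
from pyspark.sql.functions import col

df = spark.table("database.table")

# col() takes the column name literally, so the $ needs no escaping:
df.select(col("ex$col")).show(10)

# Strings passed to selectExpr go through the SQL parser, so backticks are needed:
df.selectExpr("`ex$col`").show(10)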
03-23-2023 07:10 AM
@LandanG The issue is with %sql.
11-20-2022 10:18 PM
Hi @Jamie Nathan
Hope all is well!
Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!