11-07-2022 02:43 PM
I'm trying to do a select on a column with special characters in Databricks SQL.
I've tried the following:
%sql
select ex$col
from database.table
limit 10;
%sql
select `ex$col `
from database.table
limit 10;
Neither works; both return an error along the lines of "ex does not exist. Did you mean one of the following: ...".
Is there a way to do this, or is this functionality not currently supported?
11-08-2022 08:21 PM
Many thanks for your reply.
Ah, we are creating the table using a Delta location:
s = f"create table {database}.{dataset_name} using delta location '{location}'"
spark.sql(s)
I can still query the special-character column using PySpark, which is good enough for me for now, but a lot of our users will want to use SQL. The only option seems to be to change the schema.
11-07-2022 07:33 PM
It might depend on how your table was created. I was able to get the below to work:
CREATE OR REPLACE TABLE database.special_chars (`ex$col` int);
INSERT INTO database.special_chars VALUES (1);
SELECT `ex$col`
FROM database.special_chars
LIMIT 10;
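If the table already sits over an existing Delta location (as in the original post), one possible workaround, assuming a Databricks Runtime recent enough to support Delta column mapping, is to enable name-based column mapping on the table so that special characters in column names are permitted:

```sql
-- Hedged sketch: enable Delta column mapping (name mode) so column
-- names may contain characters such as '$'. The table name is
-- illustrative; protocol versions 2/5 are the documented minimums.
ALTER TABLE database.special_chars SET TBLPROPERTIES (
  'delta.minReaderVersion' = '2',
  'delta.minWriterVersion' = '5',
  'delta.columnMapping.mode' = 'name'
);
```

Be aware that upgrading the table protocol is irreversible and older readers/writers may no longer be able to access the table, so test this on a non-production copy first.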
03-23-2023 07:10 AM
@LandanG The issue is with %sql.