ALIAS Not accepted 42601
3 weeks ago
I am unable to run the following query, which is generated from my backend, on the Databricks side.
Query:
SELECT "A".`cut` AS "Cut" , "A".`color` AS "Color" , "A".`carat` AS "Carat" , "A".`clarity` AS "Clarity" FROM databricksconnect.default.diamonds "A"
Error logs:
Caused by: com.pega.pegarules.pub.database.DatabaseException: Unable to query the database: code: 500051 SQLState: 42601 Message: [Databricks][JDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: 42601, Query: SELECT "A"***, Error message from Server: org.apache.hive.service.cli.HiveSQLException: Error running query: [PARSE_SYNTAX_ERROR] org.apache.spark.sql.catalyst.parser.ParseException:
[PARSE_SYNTAX_ERROR] Syntax error at or near '"Cut"'. SQLSTATE: 42601 (line 1, pos 20)
3 weeks ago - last edited 3 weeks ago
Hi there!
Aliases must be valid identifiers, not string literals. If your column alias contains a space or a special character, or is a number, wrap it in backticks. See the example below:
select
  order_id,                  -- No alias - works
  order_id as ORDER_ID,      -- Plain identifier - works
  -- order_id as "ORDER_ID", -- Double quotes - syntax error
  -- order_id as 'orderid',  -- Single quotes - syntax error
  -- order_id as _@+$,       -- Special chars - syntax error
  order_id as `_@+$`         -- Backticks - works
from my_view
The table alias should be written without quotes as well 🙂
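For example, your original query could be rewritten like this (untested sketch, keeping your catalog, schema, and table names):

SELECT
  A.`cut`     AS `Cut`,
  A.`color`   AS `Color`,
  A.`carat`   AS `Carat`,
  A.`clarity` AS `Clarity`
FROM databricksconnect.default.diamonds AS A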
Cheers!
3 weeks ago
Is there any way or workaround so that it accepts double quotes for aliases, as standard SQL and PostgreSQL do?
3 weeks ago
None that I'm aware of. You could handle it on the fly (rewrite the aliases before the query reaches Databricks), if that's a possibility.
3 weeks ago
Hi @malhm,
Double quotes are not supported for column aliases. In Databricks SQL / Spark SQL, backticks are used where PostgreSQL uses double quotes.
Check the docs:
https://spark.apache.org/docs/3.5.1/sql-ref-identifier.html
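For example, an identifier that would be double-quoted in PostgreSQL has to be backtick-quoted in Spark SQL (a minimal sketch against the diamonds table from your query):

-- PostgreSQL-style delimited identifier - fails in Spark SQL:
-- SELECT carat AS "Carat" FROM databricksconnect.default.diamonds
-- Spark SQL / Databricks equivalent using backticks:
SELECT carat AS `Carat` FROM databricksconnect.default.diamonds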