Using variables in Spark SQL
08-30-2022 02:38 AM
Is there a way to declare variables in Spark SQL like we do it in T-SQL?
08-30-2022 01:26 PM
Nice question. There was a discussion about the topic here https://community.databricks.com/s/question/0D53f00001c9RUYCA2/whats-the-equivalent-of-declare-in-da...
So the options are SQL functions, widgets, or combining Python with SQL. I would probably choose SQL functions, as they are permanent and stay in the metastore (but they cannot be used for everything).
CREATE OR REPLACE FUNCTION my_name(name STRING COMMENT 'my name')
RETURNS STRING
COMMENT 'just first name'
CONTAINS SQL DETERMINISTIC
RETURN name || ' Hubert';
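Once created, such a function behaves like a reusable named expression in any SQL statement. A minimal usage sketch (the argument value is illustrative):

```sql
-- Hypothetical call of the function defined above;
-- it concatenates the argument with the string 'Hubert'
SELECT my_name('Jan');
```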
09-07-2022 03:13 AM
Hi Hubert,
As you mentioned, it cannot be used for everything. It doesn't suit my case either: I have a lot of variable declarations, and creating a function for each variable doesn't look good.
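For many variables, Spark SQL's variable substitution may be a lighter-weight alternative than one function per variable: set a config key once, then reference it with `${...}` in later statements. A sketch, assuming variable substitution is enabled (`spark.sql.variable.substitute`, on by default in Databricks); the key prefix, table, and column names are illustrative:

```sql
-- Sketch: lightweight "variables" via Spark SQL variable substitution.
-- The keys below (var.start_date, var.min_amount) are arbitrary names.
SET var.start_date = 2022-01-01;
SET var.min_amount = 100;

-- ${...} is replaced with the value set above before the query runs
SELECT *
FROM sales                              -- hypothetical table
WHERE order_date >= '${var.start_date}'
  AND amount     >= ${var.min_amount};
```

Note that this is plain text substitution, not typed variables, so quoting (as with the date literal above) is the caller's responsibility.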
08-30-2022 02:10 PM
Could you please follow the link below and let us know if it helps?

