02-18-2015 01:26 PM
You can pass parameters/arguments to your SQL statements by programmatically building the SQL string in Scala/Python and passing it to sqlContext.sql(string).

Here's an example using string interpolation in Scala:

val param = 100
sqlContext.sql(s"""SELECT * FROM table1 WHERE param=$param""")

Note the 's' in front of the first """. It enables substitution of $param references inside a Scala string.

Here's an example using string formatting in Python:

param = 100
query = "SELECT * FROM table1 WHERE param={}".format(param)
sqlContext.sql(query)

05-26-2016 12:23 PM
I tried the same with an insert query like below:

val a = 1207
val b = "amit"
val c = 20
sqlContext.sql(s"""insert into table employee select t.* from (select $a, $b, $c) t""")

But I got the following error:

Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve 'amit' given input columns;
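The error above happens because the interpolated value of $b lands in the SQL text without quotes, so Spark parses amit as a column reference rather than a string value. A minimal sketch of the fix in Python (same table and values as the post above; the only change is quoting the string literal):

```python
a = 1207
b = "amit"
c = 20
# Wrap the string value in single quotes so Spark parses it as a
# literal, not a column name:
query = "insert into table employee select t.* from (select {}, '{}', {}) t".format(a, b, c)
print(query)
# sqlContext.sql(query)  # run inside an active Spark session
```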
07-07-2016 11:57 PM
How can we pass a date parameter to Spark SQL in Python?
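One way, following the same string-formatting approach as the accepted answer: format the date as an ISO string and quote it in the SQL text. A sketch (the table and column names `events` and `event_date` are made up for illustration):

```python
import datetime

d = datetime.date(2016, 7, 7)
# ISO format ('YYYY-MM-DD') is what Spark SQL expects for DATE literals;
# note the single quotes around the substituted value.
query = "SELECT * FROM events WHERE event_date = '{}'".format(d.isoformat())
print(query)  # SELECT * FROM events WHERE event_date = '2016-07-07'
# sqlContext.sql(query)  # run inside an active Spark session
```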
09-03-2018 05:45 AM
Worked in my scenario. Thank you!
06-08-2022 04:39 PM
Isn't it bad practice to embed SQL parameters by string formatting?
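It can be, if the values come from untrusted input: plain string formatting allows SQL injection. Newer Spark versions (3.4+, if your runtime has them) support parameterized queries with named markers, which avoids interpolation entirely; on older versions a minimal precaution is escaping single quotes before interpolating. A sketch (the escape helper is my own illustration, not a complete injection defense):

```python
def sql_quote(value):
    # Escape embedded single quotes by doubling them, then wrap
    # the whole value in single quotes.
    return "'" + str(value).replace("'", "''") + "'"

user_input = "O'Brien"
query = "SELECT * FROM table1 WHERE name = {}".format(sql_quote(user_input))
print(query)  # SELECT * FROM table1 WHERE name = 'O''Brien'

# On Spark 3.4+ you can instead pass parameters directly and skip
# string formatting altogether (assumes an active SparkSession `spark`):
# spark.sql("SELECT * FROM table1 WHERE name = :name", args={"name": user_input})
```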
08-03-2016 11:44 AM
Surround the query in single quotes so you can use double quotes around the formatted value:

query = 'SELECT * FROM dbsnp where rs_id="{}"'.format(rs_id)
10-28-2019 11:47 AM
If you are on Python 3.6 or higher, you can even use f-strings:

query = f'SELECT * FROM dbsnp where rs_id={rs_id} and {key}="abc"'
10-22-2016 03:33 PM
How do you pass multiple parameters in Spark SQL? Can you please help me?
11-06-2016 04:24 PM
@chandras
String interpolation for Scala is shown below. You can use it to build your query; something similar exists for Python.

1. Your parameters

val p1 = "('0001','0002','0003')"
val p2 = "chandras"

2. Build the query (note the quotes around $p2, since it is a string value)

val query = s"select * from df where id in $p1 and name = '$p2'"

3. Then you can run it:

val df1 = sqlContext.sql(query)
01-10-2017 04:21 AM
Option 1:

sqlContext.sql("use %s" % db)

Option 2:

>>> db = "juliodb"
>>> print(db)
juliodb
>>> query = 'use {}'.format(db)
>>> sqlContext.sql(query)
DataFrame[]
>>> b = sqlContext.sql("show tables")
>>> b.show()
+---------+-----------+
|tableName|isTemporary|
+---------+-----------+
|  firsttb|      false|
+---------+-----------+
01-10-2017 04:32 AM
Option 3:

>>> id = 2
>>> sqlContext.sql('select * from firsttb where a = {}'.format(id))
DataFrame[a: int, b: string]
>>> result = sqlContext.sql('select * from firsttb where a = {}'.format(id))
>>> result.show()
+---+-----+
|  a|    b|
+---+-----+
|  2|cesar|
+---+-----+
03-26-2019 04:30 AM
All these solutions build the SQL string in Python or Scala; none of them pass a parameter to the SQL itself.
09-17-2019 06:44 AM
You can use widgets.

1. Create the widget

Python:
dbutils.widgets.text("y", '3')

or SQL:
CREATE WIDGET TEXT y DEFAULT '3'

2. Access the current value of the widget

Python:
dbutils.widgets.get("y")

or SQL:
SELECT * FROM test_table WHERE key=getArgument("y")
-- The old way of referencing widgets in SQL queries with the $<parameter> syntax (e.g. key=$y) still works as before

For more information about widgets, please check this link:
https://docs.databricks.com/user-guide/notebooks/widgets.html
11-04-2019 10:56 PM
You might want to take a look at Widgets https://docs.databricks.com/notebooks/widgets.html