<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How can I call a stored procedure in Spark Sql? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29414#M21142</link>
    <description>&lt;P&gt;&lt;/P&gt;
&lt;P&gt;You can use a user-defined function (UDF).&lt;/P&gt;
&lt;P&gt;&lt;/P&gt;</description>
    <pubDate>Mon, 29 Oct 2018 21:28:22 GMT</pubDate>
    <dc:creator>xsobh</dc:creator>
    <dc:date>2018-10-29T21:28:22Z</dc:date>
    <item>
      <title>How can I call a stored procedure in Spark Sql?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29412#M21140</link>
      <description>&lt;P&gt;&lt;/P&gt;
&lt;P&gt;I have seen the following code:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;// Credentials passed as options, so the URL needs no query string
val url = "jdbc:mysql://yourIP:yourPort/test"
val df = sqlContext.read
  .format("jdbc")
  .option("url", url)
  .option("user", "yourUsername")
  .option("password", "yourPassword")
  .option("dbtable", "people")
  .load()&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;But I need to run a stored procedure. When I use the &lt;CODE&gt;exec&lt;/CODE&gt; command in the &lt;CODE&gt;dbtable&lt;/CODE&gt; option above, it gives me this error:&lt;/P&gt;
&lt;P&gt;&lt;CODE&gt;com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near the keyword 'exec'.&lt;/CODE&gt;&lt;/P&gt;
&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 10 Nov 2016 16:44:22 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29412#M21140</guid>
      <dc:creator>mashaye</dc:creator>
      <dc:date>2016-11-10T16:44:22Z</dc:date>
    </item>
    <item>
      <title>Re: How can I call a stored procedure in Spark Sql?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29413#M21141</link>
      <description>&lt;P&gt;&lt;/P&gt;
&lt;P&gt;Hi.&lt;/P&gt;
&lt;P&gt;From the docs:&lt;/P&gt;
&lt;P&gt;&lt;B&gt;The JDBC table that should be read. Note that anything that is valid in a &lt;CODE&gt;FROM&lt;/CODE&gt; clause of a SQL query can be used. For example, instead of a full table you could also use a subquery in parentheses.&lt;/B&gt;&lt;/P&gt;
&lt;P&gt;&lt;/P&gt;
&lt;P&gt;So the &lt;CODE&gt;dbtable&lt;/CODE&gt; option has to be a table or a subquery; alternatively, you can use table-valued functions to achieve the same result as a stored procedure.&lt;/P&gt;
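&lt;P&gt;As a minimal sketch of the "subquery as dbtable" pattern (the host, credentials, and query are placeholders):&lt;/P&gt;

```python
# Anything valid in a FROM clause works as "dbtable", so a parenthesised
# subquery with an alias is pushed down and executed on the server.
subquery = "(SELECT id, name FROM people WHERE age > 21) AS adults"

jdbc_options = {
    "url": "jdbc:sqlserver://yourHost:1433;databaseName=test",
    "dbtable": subquery,
    "user": "yourUsername",
    "password": "yourPassword",
}

# With a live SparkSession named `spark`, the options would be used as:
# df = spark.read.format("jdbc").options(**jdbc_options).load()
```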
&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 29 Oct 2018 11:09:00 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29413#M21141</guid>
      <dc:creator>ShaunRyan1</dc:creator>
      <dc:date>2018-10-29T11:09:00Z</dc:date>
    </item>
    <item>
      <title>Re: How can I call a stored procedure in Spark Sql?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29414#M21142</link>
      <description>&lt;P&gt;&lt;/P&gt;
&lt;P&gt;You can use a user-defined function (UDF).&lt;/P&gt;
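&lt;P&gt;A rough sketch of what that looks like (the function name and registration are illustrative). Note that a UDF only transforms column values on the Spark executors; it does not execute a server-side stored procedure:&lt;/P&gt;

```python
# A plain Python function that could be registered as a Spark SQL UDF.
def title_case(name):
    """Normalise a name column value; None stays None."""
    if name is None:
        return None
    return name.strip().title()

# Registration, assuming a live SparkSession named `spark`:
# spark.udf.register("title_case", title_case)
# spark.sql("SELECT title_case(name) FROM people")
```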
&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 29 Oct 2018 21:28:22 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29414#M21142</guid>
      <dc:creator>xsobh</dc:creator>
      <dc:date>2018-10-29T21:28:22Z</dc:date>
    </item>
    <item>
      <title>Re: How can I call a stored procedure in Spark Sql?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29415#M21143</link>
      <description>&lt;P&gt;&lt;/P&gt;
&lt;P&gt;This doesn't seem to be supported. There is an alternative, but it requires using pyodbc and installing it via your cluster's init script. Details can be found here:&lt;/P&gt;
&lt;P&gt;&lt;A href="https://datathirst.net/blog/2018/10/12/executing-sql-server-stored-procedures-on-databricks-pyspark" target="_blank"&gt;https://datathirst.net/blog/2018/10/12/executing-sql-server-stored-procedures-on-databricks-pyspark&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;I have tested this myself and it works fine. If anyone has any alternative methods, please let me know.&lt;/P&gt;
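&lt;P&gt;A hedged sketch of the driver-side approach from the linked post (pyodbc is assumed to be installed by an init script; the procedure name, parameter, and connection string are placeholders):&lt;/P&gt;

```python
def exec_statement(proc_name, params):
    """Build a parameterised EXEC statement for pyodbc, one '?' per argument."""
    placeholders = ", ".join("?" for _ in params)
    return "EXEC {} {}".format(proc_name, placeholders)

stmt = exec_statement("dbo.usp_refresh_people", ["2019-06-01"])

# With a connection (this runs on the driver node only, not the workers):
# import pyodbc
# conn = pyodbc.connect(connection_string, autocommit=True)
# conn.cursor().execute(stmt, "2019-06-01")
```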
&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 04 Jun 2019 03:34:16 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29415#M21143</guid>
      <dc:creator>j500sut</dc:creator>
      <dc:date>2019-06-04T03:34:16Z</dc:date>
    </item>
    <item>
      <title>Re: How can I call a stored procedure in Spark Sql?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29416#M21144</link>
      <description>&lt;P&gt;&lt;/P&gt;
&lt;P&gt;Hi, could you please elaborate? My understanding is that unless you bury some dynamic SQL in a UDF, you can't do anything other than select data and return it.&lt;/P&gt;
&lt;P&gt;Chris&lt;/P&gt; 
&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 23 Jul 2019 08:22:38 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29416#M21144</guid>
      <dc:creator>ChristianBracch</dc:creator>
      <dc:date>2019-07-23T08:22:38Z</dc:date>
    </item>
    <item>
      <title>Re: How can I call a stored procedure in Spark Sql?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29417#M21145</link>
      <description>&lt;P&gt;&lt;/P&gt;
&lt;P&gt;Thanks. I found this article too. I was concerned about it running on the driver and blocking all the worker nodes, which sounds quite bad if you have many concurrent jobs or need to call stored procedures frequently. Are you still using this approach, or did you find another one?&lt;/P&gt;
&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 23 Jul 2019 08:23:52 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29417#M21145</guid>
      <dc:creator>ChristianBracch</dc:creator>
      <dc:date>2019-07-23T08:23:52Z</dc:date>
    </item>
    <item>
      <title>Re: How can I call a stored procedure in Spark Sql?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29418#M21146</link>
      <description>&lt;P&gt;&lt;/P&gt;
&lt;P&gt;Hi &lt;A href="https://users/31196/christian-bracchi.html" target="_blank"&gt;@Christian Bracchi&lt;/A&gt;, we're still using this approach at the moment and haven't experienced any issues so far. Although we only have one production job running at the moment!&lt;/P&gt; 
&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 16 Aug 2019 05:35:28 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-can-i-call-a-stored-procedure-in-spark-sql/m-p/29418#M21146</guid>
      <dc:creator>j500sut</dc:creator>
      <dc:date>2019-08-16T05:35:28Z</dc:date>
    </item>
  </channel>
</rss>

