Understanding and loading SQL Server Temp Tables from Databricks

ranged_coop
Valued Contributor II

Hi everyone...

Came across this question on Stack Overflow and wanted to try my hand at it. Unfortunately I have not been able to solve it...

https://stackoverflow.com/questions/78953930/create-and-load-sql-server-temp-table-table-or-table-fr...

Some observations for your consideration:

1. I was able to create both local (`#table`) and global (`##table`) temp tables within a pyodbc code block. They are available within that function but not accessible outside of it (see the pyodbc sketch after this list).

2. Using the `prepareQuery` option I am able to create the temp table in both read and write, but again it is only available within that single read/write command, i.e. `spark.read.format("jdbc")`, and not outside of it (second sketch below).

3. I also tried writing a DataFrame directly with the `createTableOptions` and `createTableColumnTypes` options, but even that is not going anywhere.
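
For observation 1, a minimal pyodbc sketch of what I tried (server, database, and credentials are placeholders); the temp table is tied to the connection's session, so it disappears as soon as the connection is closed:

```python
import pyodbc

# Placeholder connection details -- substitute your own server/credentials.
conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.example.com;DATABASE=mydb;"
    "UID=user;PWD=secret"
)

conn = pyodbc.connect(conn_str, autocommit=True)
cur = conn.cursor()

# The #temp table lives only for this connection's session.
cur.execute("CREATE TABLE #staging (id INT, val NVARCHAR(50))")
cur.execute("INSERT INTO #staging VALUES (1, 'a')")
print(cur.execute("SELECT COUNT(*) FROM #staging").fetchval())  # prints 1

conn.close()

# A new connection is a new session, so the temp table is gone:
conn2 = pyodbc.connect(conn_str, autocommit=True)
conn2.cursor().execute("SELECT * FROM #staging")  # fails: Invalid object name '#staging'
```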
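
For observation 2, the `prepareQuery` attempt, following the example in the Spark JDBC docs linked below (URL and credentials again placeholders). `prepareQuery` is prefixed to `query` and both run over the same connection, which is why the temp table only exists for that one command:

```python
jdbc_url = "jdbc:sqlserver://myserver.example.com:1433;databaseName=mydb"

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("user", "user")
    .option("password", "secret")
    # Runs first, on the same connection as "query" below.
    .option("prepareQuery", "(SELECT 1 AS id, 'a' AS val INTO #tmp)")
    .option("query", "SELECT * FROM #tmp")
    .load()
)
df.show()  # works

# A second read opens a fresh connection where #tmp no longer exists,
# so the same "query" option fails there with "Invalid object name '#tmp'".
```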

Questions:

1. How do I create a SQL Server temp table (not a Databricks one), load it from a Spark DataFrame, and have it accessible across the notebook?

2. What is the scope of a DB connection in Spark? Is it limited to just that one spark.read/spark.write command? Is it one connection per command, or one connection per executor or core?

3. Is there any way to ensure that spark.write does not commit if there is a failure? I have tried setting autocommit to false, but it is not working. I read somewhere that Spark does indeed commit every n rows even in the case of a failure (see the sketch after these questions).
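
On question 3, my understanding (happy to be corrected) is that the JDBC writer commits each partition in its own transaction, sending `batchsize` rows per round trip, so a failed job can still leave earlier partitions committed. The only workaround I can think of is writing to a staging table and promoting it in one transaction afterwards; a sketch with hypothetical table names `dbo.stg_orders` / `dbo.orders`, reusing `conn_str` from the pyodbc sketch above:

```python
# Step 1: let Spark write to a staging table that may be left half-filled on failure.
(
    df.write.format("jdbc")
    .option("url", jdbc_url)
    .option("user", "user")
    .option("password", "secret")
    .option("dbtable", "dbo.stg_orders")  # staging, not the real target
    .option("batchsize", 10000)           # rows per JDBC batch, not per commit
    .mode("overwrite")
    .save()
)

# Step 2: promote atomically on one connection; if the write above failed,
# this never runs and dbo.orders is untouched.
conn = pyodbc.connect(conn_str)  # autocommit defaults to False
try:
    cur = conn.cursor()
    cur.execute("TRUNCATE TABLE dbo.orders")
    cur.execute("INSERT INTO dbo.orders SELECT * FROM dbo.stg_orders")
    conn.commit()  # single commit for the whole promotion
except Exception:
    conn.rollback()
    raise
finally:
    conn.close()
```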

Useful Link:

https://spark.apache.org/docs/latest/sql-data-sources-jdbc.html

I hope brighter minds than mine can help me with the issue. 

Thank you very much...

