12-20-2021 06:57 AM
I'm trying to connect to Databricks using pyodbc and I'm running into an issue with struct columns.
As far as I understand, struct columns and array columns are not supported by pyodbc, but they are converted to JSON. However, when nested columns contain quotes ("), the JSON that is returned is invalid, i.e. the quotes are not escaped.
Dataset I used:
from pyspark.sql import Row

data = [
    Row(nested=Row(nested_column="nested with a \" character")),
]
df = spark.sparkContext.parallelize(data).toDF()
(
    df.write
    .format("delta")
    .mode("overwrite")
    .option("path", "/delta/quotes_issue/")
    .saveAsTable("default.quotes_issue")
)
Querying the data afterwards through pyodbc (following the docs):
import pyodbc

conn = pyodbc.connect("DSN=Databricks_Cluster", autocommit=True)
cursor = conn.cursor()
cursor.execute("SELECT * FROM default.quotes_issue")
rows = cursor.fetchall()
for r in rows:
    print(r)
This gives me the following invalid JSON output.
'{"nested_column":"nested with a " character"}'
I've tried both the Databricks compute cluster endpoint and the newer SQL endpoint. I also tried the Databricks SQL Connector for Python, but they all have the same problem.
Is there a way to configure Databricks or the Simba driver to escape quotes correctly?
12-22-2021 12:39 AM
Hi @Kaniz Fatma,
Thanks for the reply. I'm still curious about the answer to the question above, but I did manage to find a workaround.
Workaround
Using the to_json method, you can let Spark convert all the complex columns into JSON, instead of relying on the SQL endpoint / Simba driver to do this for you. If you want to completely automate the use of to_json, the DESCRIBE TABLE statement can help you find out which data types your columns have; a rough sketch of that approach is below.
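For illustration, here's a minimal sketch of that automation from the pyodbc side, reusing the DSN and table name from the original post. The filtering of DESCRIBE TABLE output rows is an assumption and may need adjusting for partitioned tables or other schemas.

import pyodbc

conn = pyodbc.connect("DSN=Databricks_Cluster", autocommit=True)
cursor = conn.cursor()

# Inspect the schema to find complex (struct/array/map) columns.
cursor.execute("DESCRIBE TABLE default.quotes_issue")
schema = cursor.fetchall()

# Wrap complex columns in to_json, pass the rest through unchanged.
select_exprs = []
for col_name, data_type, *_ in schema:
    if not col_name or col_name.startswith("#"):
        continue  # skip blank/metadata rows in DESCRIBE output (assumption)
    if data_type.startswith(("struct", "array", "map")):
        select_exprs.append(f"to_json(`{col_name}`) AS `{col_name}`")
    else:
        select_exprs.append(f"`{col_name}`")

query = f"SELECT {', '.join(select_exprs)} FROM default.quotes_issue"
cursor.execute(query)
for row in cursor.fetchall():
    print(row)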
12-22-2021 05:03 AM
Hi @Kaniz Fatma,
My question wasn't related to connecting Databricks to SQL Server, but to an error I'm getting while connecting to Databricks using pyodbc.
Thanks,
Derk
12-22-2021 06:12 AM
Hi @Kaniz Fatma, thanks for the reply. I think I misunderstood. So the suggestion is to install a number of packages and update pyodbc on the Databricks cluster you're using? I'll give it a try.
12-23-2021 02:48 AM
Thanks @Kaniz Fatma (Databricks). I have tried adding the above script as a startup script to the cluster configuration I'm using within Databricks, but the issue remains.
12-27-2021 04:21 PM
@Derk Crezee - I learned something today. Apparently ODBC does not convert to JSON. There is no defined spec for how to return complex types; in fact, that capability was only added in SQL:2016. That's exactly what you are running into!
End of history lesson :) Here's a fix:
SELECT to_json(nested)
FROM default.quotes_issue
This will return:
{"nested_column":"nested with a \" character"}
which is valid JSON.
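If you want to turn that back into a Python dict on the client, something like this should work (a small sketch reusing the DSN from the original post; json.loads is just to show the string now round-trips):

import json
import pyodbc

conn = pyodbc.connect("DSN=Databricks_Cluster", autocommit=True)
cursor = conn.cursor()

# Let Spark serialize the struct, so the embedded quote is escaped properly.
cursor.execute("SELECT to_json(nested) AS nested FROM default.quotes_issue")
for (nested_json,) in cursor.fetchall():
    nested = json.loads(nested_json)   # parses cleanly now
    print(nested["nested_column"])     # -> nested with a " character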
01-03-2022 10:46 AM
Hey Bilal,
Thanks for your reply. This indeed works and is what I'm currently using as a workaround (see one of the previous posts).
Thanks,
Derk
01-04-2022 02:53 AM
@Derk Crezee - sorry I missed that you already had a fix for this.