Data Engineering

Table Fields Have a Different Value and Data Type in SQL Editor vs a SQL Notebook Cell

help_needed_445
Contributor

When I query a numeric field in the SQL Editor, it returns a value of 0.02875 with a data type of decimal, but when I run the same query in a SQL notebook cell, it returns 0.0287500 and decimal(7,7). I'm assuming this is expected behavior, but is there any documentation on this? After looking at the table schema, I found that 0.0287500 and decimal(7,7) are correct, but it would be nice to have confirmation because I can't find information on this difference anywhere on the web. Is the notebook returning the raw table value and data type?
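One way to confirm the declared type from either interface, independent of display formatting, is to inspect the schema directly (a sketch; the table and column names below are hypothetical):

```sql
-- Hypothetical table/column names; substitute your own.
-- DESCRIBE TABLE shows the declared column types from the table schema.
DESCRIBE TABLE my_catalog.my_schema.rates;

-- typeof() returns the runtime SQL type of an expression.
SELECT rate, typeof(rate) AS rate_type
FROM my_catalog.my_schema.rates
LIMIT 1;
```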

SQL Editor

[Screenshot: query result showing 0.02875, type decimal]

SQL notebook cell

[Screenshot: query result showing 0.0287500, type decimal(7,7)]


3 REPLIES

Khaja_Zaffer
Contributor III (Accepted Solution)

Hello @help_needed_445

Good day!

This is indeed a very interesting case study!

Here is what I found from LLM models:

Yes, this difference in decimal display between the Databricks SQL Editor (which uses the Photon engine in Databricks SQL) and notebooks (which use Spark SQL) is expected behavior, related to how each interface handles formatting and schema inference for decimal types, particularly with underlying storage formats like Parquet in Delta tables.
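For what it's worth, the effect is straightforward to reproduce (a minimal sketch, assuming a table with an explicit DECIMAL(7,7) column; the table name is hypothetical):

```sql
-- DECIMAL(7,7) stores exactly 7 fractional digits, so 0.02875 is stored as 0.0287500.
CREATE OR REPLACE TABLE demo_decimals (rate DECIMAL(7,7));
INSERT INTO demo_decimals VALUES (0.02875);

-- A notebook cell renders the full declared scale: 0.0287500, decimal(7,7).
-- The SQL Editor appears to trim trailing zeros for display: 0.02875.
SELECT rate FROM demo_decimals;
```

Running the same SELECT in both interfaces should make the display difference visible side by side.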
 
After this, I created a table in MySQL to check how that RDBMS behaves; interestingly, MySQL gave the expected result.

[Screenshot: MySQL query result]

What I understand from LLM models is that the SQL Editor uses the Photon engine; if that is true, it has a different design, and the output is different.

However, even the LLM models failed to share any documentation link for reference on this.

Can anyone check this on an interactive cluster with Photon acceleration enabled? (Just to check; I am not sure about this.)

Thank you.
I am open to other solutions; it would be interesting to understand this. :))

Thanks for your response.

I also asked an LLM and got a lot of information I couldn't find via Google or the Databricks website. It said, basically, that the SQL Editor performs certain actions to show a more "user-friendly" value.

For example:
1. Showing a simplified data type label
2. Trimming trailing zeros for display

There are probably others, but I think the big idea is that a notebook should be used to get the actual values that are in a table.
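If you need the exact stored value regardless of how a given UI formats decimals, one workaround (a sketch, reusing the hypothetical demo_decimals table from above) is to cast to STRING, since strings are displayed as-is:

```sql
-- CAST to STRING renders the decimal at its full declared scale,
-- so both the SQL Editor and a notebook should show 0.0287500.
SELECT rate,
       CAST(rate AS STRING) AS rate_text,
       typeof(rate)         AS rate_type
FROM demo_decimals;
```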

You are right!

But I believe there is more going on internally than just user-friendly formatting.
