10-06-2022 08:13 AM
Hello,
I am getting an inconsistent display of long (bigint) values.
1661817599972 is the Unix timestamp in milliseconds for Monday, August 29, 2022 11:59:59.972 PM GMT.
when I execute:
`select 1661817599972 as t`
the result is:
`166181759997` (the last digit is truncated)
The number is correctly cast as a bigint according to `describe select 1661817599972 as t`.
However, when I execute:
`select timestamp_millis(1661817599972) as t`
the result is (correctly) `2022-08-29T23:59:59.972+0000`, so the internal representation of the value seems correct. This appears to happen for bigints only. For instance:
`select 16618175678 as t`
yields:
`1661817567`
but
`select 1661817567 as t`
also yields
`1661817567`
DBR 11.2
Thank you,
Cosimo.
PS: The value displays correctly in DBSQL.
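The truncation can be double-checked outside the notebook. A minimal Python sketch (not Spark; just the standard library) confirms which of the two displayed values actually corresponds to the intended instant:

```python
from datetime import datetime, timezone

# The full value round-trips to the expected instant (2022-08-29 23:59:59.972 UTC).
good = datetime.fromtimestamp(1661817599972 / 1000, tz=timezone.utc)
print(good)  # 2022-08-29 23:59:59.972000+00:00

# The truncated value shown by the notebook lands in 1975, so the
# truncation happens at display time, not in the stored bigint.
bad = datetime.fromtimestamp(166181759997 / 1000, tz=timezone.utc)
print(bad.year)  # 1975
```

This is consistent with `timestamp_millis(1661817599972)` rendering correctly in the notebook: the underlying value is intact and only the rendered integer is cut short.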
10-20-2022 05:22 AM
Hi, I tried again on another cluster and it is still working OK. Maybe try changing the runtime version. Additionally, check Spark UI -> Environment -> Spark properties; perhaps an option is set there that changes this behavior. I looked through the options just now but couldn't find anything related.
10-25-2022 04:27 PM
Thank you both for following up; unfortunately, I'm still having the issue. It seems like a notebook frontend issue to me...
@Hubert Dudek thanks for the suggestion, I will take a look.
Best,
Cosimo.
11-14-2022 12:58 AM
Hi @Cosimo Felline
Hope all is well!
Does @Hubert Dudek's response answer your question?
If it resolved your issue, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!