Data Engineering

Simba ODBC datetime with millisecond overflows

158808
New Contributor II

Hello,

Using ODBC driver 2.6.24.1041-2 for Linux, when inserting rows with a millisecond-precision timestamp (e.g. 2022-07-03 13:57:48.500) I get:

2022/07/03 14:41:19 SQLExecute: {22008} [Simba][Support] (40520) Datetime field overflow resulting from invalid datetime.

With Scala it works fine:

spark.sql("INSERT INTO ... () VALUES ('2022-07-03 13:57:48.500')");

Is there a way to insert with millisecond precision over ODBC (or JDBC, which I haven't tried yet)?

Thank you,

Nikos


5 REPLIES

Kaniz
Community Manager (Accepted Solution)

Hi @Nikos Mitsis, this generally happens when the date format is wrong in the data.

Prabakar
Esteemed Contributor III

Hi @Nikos Mitsis, have you tried `CAST()`?
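
A minimal sketch of what that CAST() workaround might look like from Go, assuming the github.com/alexbrainman/odbc driver and a hypothetical table named events with a single TIMESTAMP column (the driver, DSN, and table name are placeholders, not from this thread); the millisecond value stays inside the SQL text and is converted server-side instead of being bound as a datetime parameter:

    package main

    import (
        "database/sql"
        "log"

        _ "github.com/alexbrainman/odbc" // registers the "odbc" driver
    )

    func main() {
        // Placeholder DSN; substitute your own Simba/Databricks ODBC data source.
        db, err := sql.Open("odbc", "DSN=Databricks")
        if err != nil {
            log.Fatal(err)
        }
        defer db.Close()

        // Keep the millisecond-precision value as a string literal and let the
        // server convert it with CAST().
        _, err = db.Exec("INSERT INTO events VALUES (CAST('2022-07-03 13:57:48.500' AS TIMESTAMP))")
        if err != nil {
            log.Fatal(err)
        }
    }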

Kaniz
Community Manager

Hi @Nikos Mitsis, we haven't heard back from you since the last response from @Prabakar and me, and I was checking to see whether our suggestions helped. If you have found a solution, please share it with the community, as it may help others.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.

158808
New Contributor II

I was passing a string (e.g. '2022-07-03 13:57:48.500') to the Golang SQL driver, which fails when the millisecond part is present but works without it (e.g. '2022-07-03 13:57:48'). Passing a native Golang time.Time works for timestamps with a millisecond part too.
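
A minimal sketch of that working approach, under the same assumptions as above (alexbrainman/odbc driver, placeholder DSN, hypothetical events table with one TIMESTAMP column): the timestamp is built as a native time.Time carrying the 500 ms and bound as a parameter rather than formatted into the SQL string:

    package main

    import (
        "database/sql"
        "log"
        "time"

        _ "github.com/alexbrainman/odbc" // registers the "odbc" driver
    )

    func main() {
        db, err := sql.Open("odbc", "DSN=Databricks") // placeholder DSN
        if err != nil {
            log.Fatal(err)
        }
        defer db.Close()

        // 2022-07-03 13:57:48.500: the millisecond part is 500_000_000 ns.
        ts := time.Date(2022, time.July, 3, 13, 57, 48, 500_000_000, time.UTC)

        // Binding a time.Time lets the driver send the value with full precision,
        // avoiding the datetime-overflow error raised for the string form.
        _, err = db.Exec("INSERT INTO events VALUES (?)", ts)
        if err != nil {
            log.Fatal(err)
        }
    }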

Thank you for your help.

Kaniz
Community Manager

Hi @Nikos Mitsis, thank you for sharing your workaround.

Please remember to mark the best answer for the community.
