05-11-2023 12:07 AM
I am trying to read change tracking data from a Snowflake query into a DataFrame using Databricks.
The same query works in Snowflake but not in Databricks, even though the time zones and timestamp formats are the same on both sides. I am trying to implement the change tracking feature with the query below; there is no problem in Snowflake, but in Databricks I get an error.
Below are my query and the error.
SELECT * FROM TestTable CHANGES(INFORMATION => DEFAULT) AT(TIMESTAMP => '2023-05-03 00:43:34.885 -7000')
Error:
Time travel data is not available for table TestTable. The requested time is either beyond the allowed time travel period or before the object creation time.
Please help.
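For context, here is a minimal PySpark sketch of how such a query can be pushed down from a Databricks notebook, assuming the Snowflake Spark connector (the "snowflake" data source available in Databricks) and placeholder connection options:

# spark is the SparkSession pre-defined in a Databricks notebook.
# Placeholder connection options; replace with real values or secrets.
sf_options = {
    "sfUrl": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>",
}

# The statement from the question; the "query" option sends it to Snowflake,
# so CHANGES(...) and AT(...) are evaluated by Snowflake, not by Spark.
change_query = """
SELECT * FROM TestTable CHANGES(INFORMATION => DEFAULT)
AT(TIMESTAMP => '2023-05-03 00:43:34.885 -7000')
"""

df = (spark.read
      .format("snowflake")
      .options(**sf_options)
      .option("query", change_query)
      .load())
df.show()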
- Labels:
  - databricks
  - Pyspark Dataframe
Accepted Solutions
05-12-2023 10:33 AM
@SkvAdi Based on the error, the time provided seems to be incorrect; the timestamp you supplied might be before the table's creation date and time. Can you please check when the table was created, provide a timestamp after that, and test again?
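If it helps, here is a small sketch for checking the table's creation time and Time Travel retention from Databricks, reusing the same placeholder connection options as in the question (CREATED and RETENTION_TIME come from Snowflake's INFORMATION_SCHEMA):

# Unquoted Snowflake identifiers are stored upper-case, hence 'TESTTABLE'.
check_query = """
SELECT table_name, created, retention_time
FROM information_schema.tables
WHERE table_name = 'TESTTABLE'
"""

info_df = (spark.read
           .format("snowflake")
           .options(**sf_options)   # same placeholder options as above
           .option("query", check_query)
           .load())
info_df.show(truncate=False)

The AT timestamp has to fall after CREATED and within RETENTION_TIME days of the current time; otherwise Snowflake raises exactly this time travel error.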
06-23-2023 08:18 AM
Your timestamp format is wrong; that's why you got the error.
Try this:
SELECT * FROM TestTable CHANGES(INFORMATION => DEFAULT)
AT(TIMESTAMP => TO_TIMESTAMP_TZ('2023-05-03 00:43:34.885','YYYY-MM-DD HH24:MI:SS.FF'))
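If the timestamp is built in Python, one way to avoid offset formatting problems (a sketch; the variable names are just examples, and it reuses the placeholder connection options from the question) is to render an explicit +/-HHMM offset and pass it through TO_TIMESTAMP_TZ:

from datetime import datetime, timedelta, timezone

# Build the AT timestamp with a proper offset (e.g. -0700, not -7000).
ts = datetime(2023, 5, 3, 0, 43, 34, 885000, tzinfo=timezone(timedelta(hours=-7)))
ts_literal = ts.strftime("%Y-%m-%d %H:%M:%S.%f")[:-3] + " " + ts.strftime("%z")
# ts_literal == '2023-05-03 00:43:34.885 -0700'

change_query = f"""
SELECT * FROM TestTable CHANGES(INFORMATION => DEFAULT)
AT(TIMESTAMP => TO_TIMESTAMP_TZ('{ts_literal}', 'YYYY-MM-DD HH24:MI:SS.FF3 TZHTZM'))
"""

df = (spark.read
      .format("snowflake")
      .options(**sf_options)   # placeholder options as in the question
      .option("query", change_query)
      .load())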