by Dinu2 • New Contributor III
- 6303 Views
- 7 replies
- 5 kudos
Timestamp columns extracted from source databases via a JDBC read are being converted to a different time zone and no longer match the source timestamps. Could anyone suggest how we can get the same timestamp data as in the source?
Latest Reply
Hi @Dinu Sukumara, we haven't heard from you since the last response from @Werner Stinckens. Kindly share the information with us, and in return we will provide you with the necessary solution. Thanks and Regards
6 More Replies
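A common remedy for this thread's problem is to pin Spark's session time zone (the real config key is `spark.sql.session.timeZone`) to the same zone the source database uses, so both systems render timestamps identically. As a minimal plain-Python illustration of why the rendered value shifts (the zone names below are arbitrary examples, not taken from the thread):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# A timestamp fetched over JDBC is an absolute instant; what differs between
# the source database and the Spark cluster is only how that instant is
# *rendered* in each system's local time zone.
instant = datetime(2022, 1, 18, 12, 0, tzinfo=timezone.utc)

as_source = instant.astimezone(ZoneInfo("Europe/Berlin"))      # hypothetical source DB zone
as_cluster = instant.astimezone(ZoneInfo("America/New_York"))  # hypothetical cluster zone

print(as_source.isoformat())   # 2022-01-18T13:00:00+01:00
print(as_cluster.isoformat())  # 2022-01-18T07:00:00-05:00
```

Setting Spark's session zone to the source's zone makes both renderings agree; alternatively, store and compare timestamps in UTC end to end.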
by JesseS • New Contributor II
- 6797 Views
- 2 replies
- 1 kudos
Here is the situation I am working with. I am trying to extract source data using the Databricks JDBC connector, with SQL Server databases as my data source. I want to write those into a directory in my data lake as JSON files, then have Auto Loader ing...
Latest Reply
To add to @werners' point, I would use ADF to load SQL Server data into ADLS Gen2 as JSON, then load these raw JSON files from your ADLS base location into a Delta table using Auto Loader. Delta Live Tables can be used in this scenario. You can also reg...
1 More Replies
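The pattern suggested in this reply is: land raw JSON in a storage directory, then ingest incrementally. On Databricks itself that is Auto Loader (`spark.readStream.format("cloudFiles")` with `cloudFiles.format` set to `json`). As a tiny local stand-in for the landing-then-parse step (directory, file, and field names here are invented for the sketch, and this toy loop does not do Auto Loader's checkpointed file tracking):

```python
import json
import pathlib
import tempfile

# Hypothetical landing directory standing in for the ADLS "raw" path that
# Auto Loader would watch.
landing = pathlib.Path(tempfile.mkdtemp())
(landing / "part-0001.json").write_text(json.dumps({"id": 1, "name": "a"}))
(landing / "part-0002.json").write_text(json.dumps({"id": 2, "name": "b"}))

# Parse every JSON file found in the directory; Auto Loader additionally
# remembers which files it has already ingested across runs.
records = [json.loads(p.read_text()) for p in sorted(landing.glob("*.json"))]
print([r["id"] for r in records])  # [1, 2]
```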
- 636 Views
- 0 replies
- 0 kudos
data = [['x', 20220118, 'FALSE', 3], ['x', 20220118, 'TRUE', 97],
        ['x', 20220119, 'FALSE', 1], ['x', 20220119, 'TRUE', 49],
        ['Y', 20220118, 'FALSE', 100], ['Y', 20220118, 'TRUE', 900],
        ['Y', 20220119, 'FALSE', 200], ['Y', 20220119, 'TRUE', 800]]
df = spark.creat...
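The question in this last post is cut off after the DataFrame creation, so its intent is unknown. Purely as an illustration of the data's shape, the rows pivot naturally into per-(key, date) TRUE/FALSE totals in plain Python (the "share of TRUE" metric below is a hypothetical example, not something the post asks for):

```python
data = [['x', 20220118, 'FALSE', 3], ['x', 20220118, 'TRUE', 97],
        ['x', 20220119, 'FALSE', 1], ['x', 20220119, 'TRUE', 49],
        ['Y', 20220118, 'FALSE', 100], ['Y', 20220118, 'TRUE', 900],
        ['Y', 20220119, 'FALSE', 200], ['Y', 20220119, 'TRUE', 800]]

# Pivot the long-format rows: one dict of {flag: count} per (key, date).
totals = {}
for key, date, flag, count in data:
    totals.setdefault((key, date), {})[flag] = count

# Example derived metric: percentage of TRUE per (key, date).
for (key, date), flags in sorted(totals.items()):
    pct_true = 100 * flags['TRUE'] / (flags['TRUE'] + flags['FALSE'])
    print(key, date, pct_true)
```

In PySpark the equivalent pivot would be `df.groupBy(...).pivot(...)` after the truncated `spark.createDataFrame` call completes.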