Get Started Discussions

Validating record counts of SQL Server database tables against data migrated to Azure Data Lake Gen2

sai_sathya
New Contributor III

We are migrating our project from on-premises to Azure. The on-premises database is SQL Server, and Azure Data Lake Gen2 is the storage location where we currently store the data. So far, we validate the record count of each SQL Server table manually; similarly, we write PySpark code in Databricks to write that data out as Parquet files and then manually validate the record count in PySpark every time, which is time-consuming.

Is it possible to automate this process in order to save time?

Can this be done with PySpark code, or is there another solution?
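One way to automate this, sketched below under assumptions about your environment: build a dictionary of per-table counts from each side and diff them. The table names, JDBC URL, and ADLS path shown in the comments are placeholders, not values from this thread; the `compare_counts` helper itself is plain Python so it can run anywhere.

```python
def compare_counts(source_counts, target_counts):
    """Return (table, source_count, target_count) tuples for every mismatch.

    A target count of None means the table was not found on the target side.
    """
    mismatches = []
    for table, src in source_counts.items():
        tgt = target_counts.get(table)
        if tgt != src:
            mismatches.append((table, src, tgt))
    return mismatches


# In a Databricks notebook, the two dictionaries would be built roughly
# like this (hypothetical JDBC URL, properties, and lake path -- adjust
# to your own environment):
#
# sql_counts = {
#     t: spark.read.jdbc(url=jdbc_url, table=t, properties=props).count()
#     for t in tables
# }
# lake_counts = {
#     t: spark.read.parquet(
#         f"abfss://container@account.dfs.core.windows.net/migrated/{t}"
#     ).count()
#     for t in tables
# }

if __name__ == "__main__":
    # Example with made-up numbers: "customers" is off by one record.
    sql_counts = {"orders": 1000, "customers": 250}
    lake_counts = {"orders": 1000, "customers": 249}
    print(compare_counts(sql_counts, lake_counts))
```

Scheduling this as a Databricks job would replace the manual check entirely; the job can fail (or alert) whenever `compare_counts` returns a non-empty list.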

1 REPLY 1

However much I have tried, I cannot get it working. I am using my local system for testing, and there are no firewalls or other security blocks. This is the error message I keep receiving and am unable to fix:
com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host SATHYA, port 1433 has failed. Error: "SATHYA. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".
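That error usually means SQL Server is not listening on TCP 1433 at the host the driver resolved, rather than a driver bug: common causes are the TCP/IP protocol being disabled in SQL Server Configuration Manager, or a named instance that needs the SQL Server Browser service to resolve its dynamic port. As a small sketch (assuming "SATHYA" from the error message is your machine name and "MyDatabase" is a placeholder), the JDBC URL for a fixed-port connection looks like this:

```python
def build_jdbc_url(host, port, database):
    """Build a SQL Server JDBC URL for a host listening on a fixed TCP port.

    For a named instance without a fixed port, you would instead append
    ";instanceName=<name>" and rely on the SQL Server Browser service.
    """
    return (
        f"jdbc:sqlserver://{host}:{port};"
        f"databaseName={database};"
        # trustServerCertificate=true skips certificate validation;
        # acceptable for a local test, not for production.
        "encrypt=true;trustServerCertificate=true;"
    )


if __name__ == "__main__":
    print(build_jdbc_url("SATHYA", 1433, "MyDatabase"))
```

With the URL built, `spark.read.jdbc(url=..., table=..., properties={"user": ..., "password": ...})` should connect once TCP/IP is enabled and the instance is actually listening on the chosen port (verifiable locally with `Test-NetConnection SATHYA -Port 1433` in PowerShell).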
