
Validating record counts of SQL Server database tables against data migrated to Azure Data Lake Gen2

sai_sathya
New Contributor III

We are migrating our project from on-premises to Azure. The on-premises database is SQL Server, and Azure Data Lake Gen2 is the storage location where we currently store the data. So far we validate the record count of each table manually against the SQL Server database, and similarly we write PySpark code in Databricks to write that data as Parquet files and then validate the record count from PySpark manually every time, which is time-consuming.

Is it possible to automate this process in order to save time?

Can this be done using PySpark code, or is there any other solution?
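A minimal sketch of what such automation could look like in a Databricks notebook, assuming a JDBC connection to the SQL Server instance and a list of table names. The hostname, credentials, table names, and ADLS Gen2 paths below are placeholders, not values from this thread:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Placeholder connection details -- replace with your own server, database, and credentials
jdbc_url = "jdbc:sqlserver://<sql-server-host>:1433;databaseName=<database>"
jdbc_props = {
    "user": "<username>",
    "password": "<password>",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

# Tables to validate and the ADLS Gen2 root where their Parquet copies were written
tables = ["dbo.Customers", "dbo.Orders"]
adls_root = "abfss://<container>@<storage-account>.dfs.core.windows.net/migrated"

results = []
for table in tables:
    # Count rows on the SQL Server side by pushing down a COUNT(*) query
    count_query = f"(SELECT COUNT(*) AS cnt FROM {table}) AS src"
    source_count = (
        spark.read.jdbc(url=jdbc_url, table=count_query, properties=jdbc_props)
        .collect()[0]["cnt"]
    )

    # Count rows in the corresponding Parquet output in ADLS Gen2
    target_count = spark.read.parquet(f"{adls_root}/{table.split('.')[-1]}").count()

    results.append((table, source_count, target_count, source_count == target_count))

# Show the comparison; a scheduled Databricks job could run this on a cadence
spark.createDataFrame(
    results, ["table", "source_count", "target_count", "match"]
).show(truncate=False)
```

Scheduling a notebook like this as a Databricks job would remove the manual step entirely, and mismatched tables could be written to a log table or raised as an alert instead of just displayed.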

1 REPLY

However much I try, I cannot get this working. I'm using my local system for testing and there are no firewalls or other security blocks. This is the error message that I keep receiving and am unable to fix:
com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host SATHYA, port 1433 has failed. Error: "SATHYA. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".
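For reference, the JDBC options the SQL Server driver expects look roughly like the sketch below. The host, database, and credentials are placeholders; the key point is that the hostname must resolve and be reachable on port 1433 from the Spark driver and executors, which a local machine name such as SATHYA typically is not from a Databricks cluster:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical connection string illustrating the pieces the driver parses;
# use an IP address or FQDN that the Databricks cluster can actually reach,
# and make sure SQL Server has the TCP/IP protocol enabled on port 1433.
jdbc_url = (
    "jdbc:sqlserver://<host-or-ip>:1433;"
    "databaseName=<database>;"
    "encrypt=true;trustServerCertificate=true"  # TLS settings depend on your server setup
)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.<table>")
    .option("user", "<username>")
    .option("password", "<password>")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)
```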
