02-08-2023 08:22 AM
Hi everyone!
I would like to know how Spark closes the connection when reading from a SQL database using the JDBC format.
Also, if there is a way to check whether the connection is active, or to close it manually, I would like to know that too.
Thank you in advance!
02-09-2023 09:27 PM
Hi, the connection will be active while the read runs; Spark connects over JDBC using the hostname/IP and port number. You can check connectivity from a notebook cell by running:
%sh telnet <jdbcHostname> <jdbcPort>
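If telnet is not installed on the cluster, the same reachability check can be done from Python with the standard library. This is just a sketch; the host and port are placeholders for your own JDBC endpoint:

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (replace with your own values):
# port_is_open("<jdbcHostname>", 1433)
```

This only verifies network reachability of the port, the same as telnet; it does not validate credentials or the JDBC driver.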
Please refer to: https://docs.databricks.com/external-data/jdbc.html
Please let us know if this helps.
02-10-2023 01:40 AM
Hi, thank you for your help! I ran that check and got an error.
Also, do you know if there is a way to close the connection manually? I am building an ingestion pipeline in which a master notebook calls a child notebook, and the JDBC connection is established in the child notebook. If the connection closes automatically after that notebook finishes, I won't need to close it manually; otherwise I need to close it so that a new connection can be established the next time the child notebook is called.
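For context on the lifecycle question: with the JDBC data source, Spark's executors open their own connections per partition while the read runs and close them when the action completes, so the notebook never holds a connection object that needs closing. A hedged sketch of what the child notebook's read might look like; the SQL Server URL format, host, database, table, and credential names below are illustrative assumptions, not details from this thread:

```python
# Hypothetical child-notebook JDBC read. All connection details here are
# placeholders; substitute your own. Spark opens and closes the underlying
# JDBC connections itself during the read.

def jdbc_options(host, port, database, table, user, password):
    """Build the option map passed to spark.read.format('jdbc')."""
    return {
        "url": f"jdbc:sqlserver://{host}:{port};database={database}",
        "dbtable": table,
        "user": user,
        "password": password,
    }

opts = jdbc_options("myserver.example.com", 1433, "mydb", "dbo.sales",
                    "reader", "secret")

# In the child notebook (requires a SparkSession named `spark`):
# df = spark.read.format("jdbc").options(**opts).load()
# After an action such as df.count() finishes, the JDBC connections used
# by the executors have already been closed by Spark.
```

Because the connections live only for the duration of the read, re-running the child notebook simply opens fresh connections; there is nothing left over from the previous run to close.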
04-08-2023 12:30 AM
Hi @João Peixoto
Hope everything is going great.
Just wanted to check in to see whether you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please let us know so we can help you.
Cheers!