<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: We are trying to connect to AWS RDS MySQL instance from DBX with PySpark using JDBC in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/98278#M39668</link>
    <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/130038"&gt;@Gaurav_Lokhande&lt;/a&gt;&amp;nbsp; With Spark JDBC usage, connectivity happens between your Databricks VPC (in your AWS account) and the RDS VPC, assuming you are using non-serverless clusters. You may need to ensure this connectivity works (e.g., via VPC peering).&lt;/P&gt;</description>
    <pubDate>Sat, 09 Nov 2024 19:29:22 GMT</pubDate>
    <dc:creator>arjun_kr</dc:creator>
    <dc:date>2024-11-09T19:29:22Z</dc:date>
    <item>
      <title>We are trying to connect to AWS RDS MySQL instance from DBX with PySpark using JDBC</title>
      <link>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/96595#M39300</link>
      <description>&lt;DIV&gt;We are trying to connect to AWS RDS MySQL instance from DBX with PySpark using JDBC:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;jdbc_df = (spark.read.format("jdbc").options(url=f"jdbc:mysql://{creds['host']}:{creds['port']}/{creds['database']}", driver="com.mysql.cj.jdbc.Driver", dbtable="(SELECT * FROM table LIMIT 10) AS t", user=creds["user"], password=creds["password"]).load())&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;Any pointers toward resolving this issue would be greatly appreciated.&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;Note: All the fields in the creds dictionary are correct.&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;We are running into the following error:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;Py4JJavaError: An error occurred while calling o436.load.&lt;/DIV&gt;&lt;DIV&gt;: com.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;The last packet sent successfully to the server was 0 milliseconds ago. 
The driver has not received any packets from the server.&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.jdbc.exceptions.SQLError.createCommunicationsException(SQLError.java:165)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:55)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:861)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.jdbc.ConnectionImpl.&amp;lt;init&amp;gt;(ConnectionImpl.java:449)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:234)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:180)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.datasources.jdbc.connection.BasicConnectionProvider.getConnection(BasicConnectionProvider.scala:50)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.datasources.jdbc.connection.ConnectionProviderBase.create(ConnectionProvider.scala:102)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1(JdbcDialects.scala:211)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1$adapted(JdbcDialects.scala:207)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.getQueryOutputSchema(JDBCRDD.scala:73)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:68)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:243)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:44)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:398)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:394)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:350)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at scala.Option.getOrElse(Option.scala:189)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:350)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:236)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:397)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at py4j.Gateway.invoke(Gateway.java:306)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at py4j.commands.CallCommand.execute(CallCommand.java:79)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at py4j.ClientServerConnection.run(ClientServerConnection.java:119)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.lang.Thread.run(Thread.java:750)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;Caused by: com.mysql.cj.exceptions.CJCommunicationsException: Communications link failure&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:423)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:52)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:95)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:140)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.exceptions.ExceptionFactory.createCommunicationsException(ExceptionFactory.java:156)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.protocol.a.NativeSocketConnection.connect(NativeSocketConnection.java:79)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.NativeSession.connect(NativeSession.java:139)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
com.mysql.cj.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:980)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:851)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;... 29 more&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;Caused by: java.net.SocketTimeoutException: connect timed out&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.net.PlainSocketImpl.socketConnect(Native Method)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.net.Socket.connect(Socket.java:613)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.protocol.StandardSocketFactory.connect(StandardSocketFactory.java:144)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.mysql.cj.protocol.a.NativeSocketConnection.connect(NativeSocketConnection.java:53)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 29 Oct 2024 05:38:41 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/96595#M39300</guid>
      <dc:creator>Gaurav_Lokhande</dc:creator>
      <dc:date>2024-10-29T05:38:41Z</dc:date>
    </item>
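For reference, the reader call from the question above can be sketched with explicit Connector/J timeouts, so a blocked network path fails quickly instead of hanging until the default timeout. This is a minimal sketch: the endpoint and credential values are placeholders, and `connectTimeout`/`socketTimeout` are standard MySQL Connector/J URL parameters (milliseconds).

```python
# Sketch: build the JDBC URL and reader options from the post above,
# adding Connector/J timeouts so a blocked path fails fast.
# All `creds` values below are hypothetical placeholders.
creds = {
    "host": "mydb.abc123.us-east-1.rds.amazonaws.com",  # hypothetical RDS endpoint
    "port": 3306,
    "database": "mydb",
    "user": "admin",
    "password": "secret",
}

# connectTimeout/socketTimeout are Connector/J URL parameters, in milliseconds.
jdbc_url = (
    f"jdbc:mysql://{creds['host']}:{creds['port']}/{creds['database']}"
    "?connectTimeout=10000&socketTimeout=30000"
)

options = {
    "url": jdbc_url,
    "driver": "com.mysql.cj.jdbc.Driver",
    "dbtable": "(SELECT * FROM table LIMIT 10) AS t",
    "user": creds["user"],
    "password": creds["password"],
}

# On a Databricks cluster you would then run:
# jdbc_df = spark.read.format("jdbc").options(**options).load()
```

The final `spark.read` line is left commented because it only runs on a cluster; the point here is the URL shape and the options dictionary.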
    <item>
      <title>Re: We are trying to connect to AWS RDS MySQL instance from DBX with PySpark using JDBC</title>
      <link>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/96991#M39387</link>
      <description>&lt;P&gt;Could you first verify, from a notebook, that network connectivity is properly working?&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;%sh nc -vz &amp;lt;jdbcHostname&amp;gt; &amp;lt;jdbcPort&amp;gt;&lt;/LI-CODE&gt;
&lt;P&gt;The&amp;nbsp;&lt;STRONG&gt;com.mysql.cj.exceptions.CJCommunicationsException: Communications link failure&lt;/STRONG&gt; typically indicates a network connectivity issue between your Databricks cluster and the MySQL database hosted on AWS RDS. The specific error &lt;STRONG&gt;java.net.SocketTimeoutException: connect timed out&lt;/STRONG&gt; suggests that the connection attempt to the MySQL server is timing out. Other related aspects to check: RDS security groups, firewall rules and settings (inbound/outbound), VPC peering or PrivateLink, etc.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 31 Oct 2024 13:10:08 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/96991#M39387</guid>
      <dc:creator>VZLA</dc:creator>
      <dc:date>2024-10-31T13:10:08Z</dc:date>
    </item>
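Where `%sh` is unavailable, the same reachability probe can be sketched in pure Python from a notebook cell. `tcp_reachable` is a hypothetical helper, not a Databricks API; it simply attempts a TCP connect, like `nc -vz`.

```python
import socket

def tcp_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Rough Python equivalent of `nc -vz host port`: attempt a TCP connect."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example against a locally bound listener (standing in for the RDS endpoint):
listener = socket.socket()
listener.bind(("127.0.0.1", 0))   # let the OS pick a free port
listener.listen(1)
port = listener.getsockname()[1]
print(tcp_reachable("127.0.0.1", port))  # True: the port accepts connections
listener.close()
```

A `False` result against the RDS host/port points at security groups, routing, or peering rather than at the JDBC options.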
    <item>
      <title>Re: We are trying to connect to AWS RDS MySQL instance from DBX with PySpark using JDBC</title>
      <link>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/97870#M39565</link>
      <description>&lt;P&gt;"The way the AWS MySQL RDS replica is whitelisted for Databricks IP addresses in &lt;A href="https://docs.databricks.com/en/resources/ip-domain-region.html" target="_self"&gt;IP addresses and domains for Databricks services and assets&lt;/A&gt;. Our AWS account is in us-east-1, and hence the IP addresses whitelisted are for Databricks, specifically the range 3.237.73.224/28. Are these the correct IP addresses for Databricks that need to be whitelisted?"&lt;/P&gt;</description>
      <pubDate>Wed, 06 Nov 2024 08:27:15 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/97870#M39565</guid>
      <dc:creator>Gaurav_Lokhande</dc:creator>
      <dc:date>2024-11-06T08:27:15Z</dc:date>
    </item>
    <item>
      <title>Re: We are trying to connect to AWS RDS MySQL instance from DBX with PySpark using JDBC</title>
      <link>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/97970#M39576</link>
      <description>&lt;P&gt;"The way the AWS MySQL RDS replica is whitelisted for Databricks IP addresses in &lt;A href="https://docs.databricks.com/en/resources/ip-domain-region.html" target="_blank" rel="noopener"&gt;IP addresses and domains for Databricks services and assets&lt;/A&gt;. Our AWS account is in us-east-1, and hence the Ip addresses whitelisted are for Databricks, specifically the range 3.237.73.224/28&amp;nbsp;Are these the correct IP addresses for Databricks that need to be whitelisted?"&lt;/P&gt;</description>
      <pubDate>Wed, 06 Nov 2024 14:17:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/97970#M39576</guid>
      <dc:creator>Gaurav_Lokhande</dc:creator>
      <dc:date>2024-11-06T14:17:01Z</dc:date>
    </item>
    <item>
      <title>Re: We are trying to connect to AWS RDS MySQL instance from DBX with PySpark using JDBC</title>
      <link>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/97977#M39577</link>
      <description>&lt;P&gt;Yes, that seems correct for the inbound traffic at least:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Control plane services, including webapp&lt;/STRONG&gt;: nvirginia.cloud.databricks.com, 3.237.73.224/28&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;SCC relay&lt;/STRONG&gt;: tunnel.us-east-1.cloud.databricks.com&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;SCC relay for PrivateLink&lt;/STRONG&gt;: tunnel.privatelink.us-east-1.cloud.databricks.com&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;The same document can be consulted for the outbound ranges.&lt;/P&gt;
&lt;P&gt;Is the netcat test going through?&lt;/P&gt;
&lt;P&gt;Some additional tests:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;
&lt;P class="_1t7bu9h1 paragraph"&gt;&lt;STRONG&gt;Security Groups and Firewalls&lt;/STRONG&gt;:&lt;/P&gt;
&lt;UL class="_1t7bu9h7 _1t7bu9h2"&gt;
&lt;LI&gt;&lt;SPAN&gt;Verify that the security group associated with your MySQL RDS instance allows inbound traffic on port 3306 (the default MySQL port) from the IP addresses or CIDR blocks used by your Databricks cluster.&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;Ensure there are no firewall rules blocking the connection.&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P class="_1t7bu9h1 paragraph"&gt;&lt;STRONG&gt;VPC Peering&lt;/STRONG&gt;:&lt;/P&gt;
&lt;UL class="_1t7bu9h7 _1t7bu9h2"&gt;
&lt;LI&gt;If your Databricks workspace is in a different VPC than your RDS instance, ensure that VPC peering is correctly configured between the two VPCs.&lt;/LI&gt;
&lt;LI&gt;Check that the route tables and network ACLs are set up to allow traffic between the VPCs.&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;/UL&gt;</description>
      <pubDate>Wed, 06 Nov 2024 14:38:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/97977#M39577</guid>
      <dc:creator>VZLA</dc:creator>
      <dc:date>2024-11-06T14:38:03Z</dc:date>
    </item>
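The security-group check in the list above can be illustrated with a small, self-contained sketch. `rule_allows` is a hypothetical helper operating on a rule's CIDR and port range, not an AWS API; in practice the real rules would be inspected in the console or via `aws ec2 describe-security-groups`.

```python
import ipaddress

def rule_allows(rule_cidr: str, rule_from: int, rule_to: int,
                client_ip: str, port: int = 3306) -> bool:
    """Check whether a (hypothetical) security-group rule covers a client IP
    and the MySQL port, mirroring the inbound check described above."""
    net = ipaddress.ip_network(rule_cidr)
    return (ipaddress.ip_address(client_ip) in net
            and rule_from <= port <= rule_to)

# Using the Databricks control-plane range quoted elsewhere in the thread:
print(rule_allows("3.237.73.224/28", 3306, 3306, "3.237.73.230"))  # True
print(rule_allows("3.237.73.224/28", 3306, 3306, "3.237.74.1"))    # False
```

A /28 covers 16 addresses (here 3.237.73.224 through .239), so an address outside that block is rejected even though the port matches.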
    <item>
      <title>Re: We are trying to connect to AWS RDS MySQL instance from DBX with PySpark using JDBC</title>
      <link>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/98202#M39644</link>
      <description>&lt;DIV&gt;Hi Community,&lt;/DIV&gt;&lt;DIV&gt;we are still getting&amp;nbsp;&lt;SPAN class=""&gt;Py4JJavaError: &lt;/SPAN&gt;&lt;SPAN&gt;An error occurred while calling o476.jdbc. : com.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;when running the following in a notebook:&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;jdbc_df = (spark.read.format("jdbc").options(url=f"jdbc:mysql://{creds['host']}:{creds['port']}/{creds['database']}", driver="com.mysql.cj.jdbc.Driver", dbtable="(SELECT * FROM table LIMIT 10) AS t", user=creds["user"], password=creds["password"]).load())&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;Any pointers toward resolving this issue would be greatly appreciated.&lt;/DIV&gt;&lt;DIV&gt;Note: All the fields in the creds dictionary are correct.&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;The 3.237.73.224/28 range was already whitelisted.&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;There are no outbound restrictions.&lt;/SPAN&gt;&lt;/DIV&gt;</description>
      <pubDate>Fri, 08 Nov 2024 17:36:43 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/98202#M39644</guid>
      <dc:creator>Gaurav_Lokhande</dc:creator>
      <dc:date>2024-11-08T17:36:43Z</dc:date>
    </item>
    <item>
      <title>Re: We are trying to connect to AWS RDS MySQL instance from DBX with PySpark using JDBC</title>
      <link>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/98265#M39663</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/130038"&gt;@Gaurav_Lokhande&lt;/a&gt;&amp;nbsp;You're receiving: "&lt;SPAN&gt;The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server."&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;So you have a client (the JDBC driver) and a server (MySQL). The message states that the client successfully sent a packet but received nothing back from MySQL. Can you please confirm that communication has been configured and allowed in both directions? E.g., in Databricks an outbound/inbound allow rule for 3.237.73.224/28 to/from MySQL:3306, and in MySQL an inbound/outbound allow rule to/from&amp;nbsp;3.237.73.224/28?&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Sat, 09 Nov 2024 09:31:10 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/98265#M39663</guid>
      <dc:creator>VZLA</dc:creator>
      <dc:date>2024-11-09T09:31:10Z</dc:date>
    </item>
    <item>
      <title>Re: We are trying to connect to AWS RDS MySQL instance from DBX with PySpark using JDBC</title>
      <link>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/98278#M39668</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/130038"&gt;@Gaurav_Lokhande&lt;/a&gt;&amp;nbsp; With Spark JDBC usage, connectivity happens between your Databricks VPC (in your AWS account) and the RDS VPC, assuming you are using non-serverless clusters. You may need to ensure this connectivity works (e.g., via VPC peering).&lt;/P&gt;</description>
      <pubDate>Sat, 09 Nov 2024 19:29:22 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/we-are-trying-to-connect-to-aws-rds-mysql-instance-from-dbx-with/m-p/98278#M39668</guid>
      <dc:creator>arjun_kr</dc:creator>
      <dc:date>2024-11-09T19:29:22Z</dc:date>
    </item>
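One prerequisite for the VPC peering suggested above is that the two VPCs' CIDR blocks must not overlap. A minimal sketch of that check, with hypothetical CIDRs for the Databricks and RDS VPCs:

```python
import ipaddress

def can_peer(cidr_a: str, cidr_b: str) -> bool:
    """VPC peering requires that the two VPC CIDR blocks do not overlap."""
    return not ipaddress.ip_network(cidr_a).overlaps(ipaddress.ip_network(cidr_b))

# Hypothetical CIDRs for the two VPCs:
print(can_peer("10.0.0.0/16", "10.1.0.0/16"))  # True: disjoint, peering possible
print(can_peer("10.0.0.0/16", "10.0.1.0/24"))  # False: overlapping ranges
```

Even with a valid peering connection, the route tables and network ACLs on both sides still have to permit the traffic, as noted earlier in the thread.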
  </channel>
</rss>