<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Getting "socket closed" with query federation to Oracle DB on Amazon RDS in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/getting-quot-socket-closed-quot-with-query-federation-to-oracle/m-p/150156#M53276</link>
    <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/169882"&gt;@luketl2&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Based on your description and the stack trace you shared, the root cause is almost certainly an NNE (Native Network Encryption) algorithm mismatch between the Oracle JDBC thin driver that Databricks uses and your Oracle RDS server-side configuration. The key clue is this line in your stack trace:&lt;/P&gt;
&lt;P&gt;Authentication lapse 0 ms&lt;/P&gt;
&lt;P&gt;This means the TCP handshake completed successfully (which you already confirmed with nc -vz) and the JDBC driver initiated the Oracle Net session, but encryption/integrity algorithm negotiation failed during authentication. Oracle then drops the socket before the authentication exchange finishes.&lt;/P&gt;
&lt;P&gt;Here is a systematic approach to diagnose and resolve this.&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;STEP 1: VERIFY THE SERVER-SIDE NNE ALGORITHM LIST&lt;/P&gt;
&lt;P&gt;On your Oracle RDS instance, check which encryption and integrity (checksum) algorithms are configured. Note that the sqlnet.* NNE settings are not database init parameters, so they will not appear in v$parameter; on RDS they are managed through the option group (see STEP 5 below). From any session that can connect (for example, via SQL*Plus), you can verify which algorithms were actually negotiated:&lt;/P&gt;
&lt;P&gt;SELECT network_service_banner&lt;BR /&gt;FROM v$session_connect_info&lt;BR /&gt;WHERE sid = SYS_CONTEXT('USERENV', 'SID');&lt;/P&gt;
&lt;P&gt;If NNE is active, the banner rows name the negotiated encryption and crypto-checksumming service adapters (for example, AES256).&lt;/P&gt;
&lt;P&gt;Or, if you have access to the RDS option group, look for these settings:&lt;/P&gt;
&lt;P&gt;sqlnet.encryption_server = REQUESTED (you confirmed this)&lt;BR /&gt;sqlnet.encryption_types_server = (list of algorithms)&lt;BR /&gt;sqlnet.crypto_checksum_server = REQUESTED or ACCEPTED&lt;BR /&gt;sqlnet.crypto_checksum_types_server = (list of algorithms)&lt;/P&gt;
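&lt;P&gt;If it helps, here is a minimal boto3 sketch that prints the current NATIVE_NETWORK_ENCRYPTION settings; the option group name "oracle-nne-options" is a placeholder for your own:&lt;/P&gt;
&lt;P&gt;import boto3&lt;BR /&gt;&lt;BR /&gt;rds = boto3.client("rds")&lt;BR /&gt;# "oracle-nne-options" is a placeholder; substitute your option group name&lt;BR /&gt;resp = rds.describe_option_groups(OptionGroupName="oracle-nne-options")&lt;BR /&gt;options = resp["OptionGroupsList"][0]["Options"]&lt;BR /&gt;# pick out the NNE option and print its settings&lt;BR /&gt;nne = next(o for o in options if o["OptionName"] == "NATIVE_NETWORK_ENCRYPTION")&lt;BR /&gt;for s in nne["OptionSettings"]:&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;print(s["Name"], "=", s["Value"])&lt;/P&gt;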
&lt;P&gt;The Oracle JDBC thin driver that ships with Databricks supports a specific set of NNE algorithms. If your server is configured to allow only algorithms the thin driver does not support (for example, older algorithms like DES or 3DES112, or newer ones not yet in the driver), negotiation fails silently and Oracle closes the socket.&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;STEP 2: ALIGN ENCRYPTION ALGORITHMS&lt;/P&gt;
&lt;P&gt;The Oracle JDBC thin driver typically supports these NNE encryption algorithms:&lt;/P&gt;
&lt;P&gt;AES256&lt;BR /&gt;AES192&lt;BR /&gt;AES128&lt;BR /&gt;3DES168&lt;/P&gt;
&lt;P&gt;And these integrity/checksum algorithms:&lt;/P&gt;
&lt;P&gt;SHA256&lt;BR /&gt;SHA384&lt;BR /&gt;SHA512&lt;BR /&gt;SHA1&lt;/P&gt;
&lt;P&gt;Make sure your RDS option group includes at least one algorithm from each list above. For example, in your custom option group, set:&lt;/P&gt;
&lt;P&gt;sqlnet.encryption_types_server = AES256,AES192,AES128&lt;BR /&gt;sqlnet.crypto_checksum_types_server = SHA256,SHA1&lt;/P&gt;
&lt;P&gt;After changing option group values, the change takes effect either immediately or during the next maintenance window, depending on how you apply it. Existing sessions keep their old negotiation, so retest with a fresh connection; if the change still does not seem to take effect, reboot the RDS instance.&lt;/P&gt;
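&lt;P&gt;If you manage the option group yourself, a boto3 sketch along these lines applies the change (the option group name is again a placeholder, and ApplyImmediately pushes the change outside the maintenance window):&lt;/P&gt;
&lt;P&gt;import boto3&lt;BR /&gt;&lt;BR /&gt;rds = boto3.client("rds")&lt;BR /&gt;rds.modify_option_group(&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;OptionGroupName="oracle-nne-options",&amp;nbsp; # placeholder&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;ApplyImmediately=True,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;OptionsToInclude=[{&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;"OptionName": "NATIVE_NETWORK_ENCRYPTION",&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;"OptionSettings": [&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;{"Name": "SQLNET.ENCRYPTION_TYPES_SERVER", "Value": "AES256,AES192,AES128"},&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;{"Name": "SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER", "Value": "SHA256,SHA1"},&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;],&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;}],&lt;BR /&gt;)&lt;/P&gt;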
&lt;P&gt;&lt;BR /&gt;STEP 3: CHECK THE CRYPTO_CHECKSUM SETTING&lt;/P&gt;
&lt;P&gt;A commonly overlooked cause of "socket closed" during NNE negotiation is the integrity/checksum configuration. Even if encryption negotiation succeeds, if the checksum algorithms do not match, Oracle will close the connection. Make sure:&lt;/P&gt;
&lt;P&gt;sqlnet.crypto_checksum_server = REQUESTED (or ACCEPTED)&lt;BR /&gt;sqlnet.crypto_checksum_types_server includes at least SHA256 or SHA1&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;STEP 4: TEST WITH EXPLICIT JDBC PROPERTIES (OPTIONAL DIAGNOSTIC)&lt;/P&gt;
&lt;P&gt;Since you already tried using the ojdbc driver from Maven directly, you can further diagnose by setting explicit NNE properties on the JDBC connection. In a notebook on your cluster, try:&lt;/P&gt;
&lt;P&gt;jdbc_url = "jdbc:oracle:thin:@//YOUR_HOST:1521/YOUR_SERVICE_NAME"&lt;/P&gt;
&lt;P&gt;connection_properties = {&lt;BR /&gt;"user": "YOUR_USER",&lt;BR /&gt;"password": "YOUR_PASSWORD",&lt;BR /&gt;"oracle.net.encryption_client": "REQUESTED",&lt;BR /&gt;"oracle.net.encryption_types_client": "(AES256)",&lt;BR /&gt;"oracle.net.crypto_checksum_client": "REQUESTED",&lt;BR /&gt;"oracle.net.crypto_checksum_types_client": "(SHA256)",&lt;BR /&gt;"oracle.jdbc.timezoneAsRegion": "false"&lt;BR /&gt;}&lt;/P&gt;
&lt;P&gt;df = spark.read.format("jdbc") \&lt;BR /&gt;.option("url", jdbc_url) \&lt;BR /&gt;.option("dbtable", "(SELECT 1 FROM DUAL)") \&lt;BR /&gt;.options(**connection_properties) \&lt;BR /&gt;.load()&lt;/P&gt;
&lt;P&gt;df.show()&lt;/P&gt;
&lt;P&gt;If this works, the issue is confirmed as an algorithm mismatch. If it still fails, try changing the encryption_types_client and crypto_checksum_types_client values to match what your server supports.&lt;/P&gt;
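&lt;P&gt;To find a working combination quickly, you can sweep candidate pairs. Here is a rough sketch that reuses jdbc_url and connection_properties from above; the candidate list is illustrative, so extend it with whatever your server allows:&lt;/P&gt;
&lt;P&gt;# sweep candidate NNE algorithm pairs until one connects&lt;BR /&gt;candidates = [&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;("AES256", "SHA256"), ("AES256", "SHA1"),&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;("AES192", "SHA256"), ("AES128", "SHA1"),&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;("3DES168", "SHA1"),&lt;BR /&gt;]&lt;BR /&gt;for enc, chk in candidates:&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;props = dict(connection_properties)&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;props["oracle.net.encryption_types_client"] = f"({enc})"&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;props["oracle.net.crypto_checksum_types_client"] = f"({chk})"&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;try:&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;spark.read.format("jdbc").option("url", jdbc_url) \&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;.option("dbtable", "(SELECT 1 AS ok FROM DUAL)") \&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;.options(**props).load().collect()&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;print("works:", enc, chk)&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;except Exception as e:&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;print("fails:", enc, chk, str(e)[:120])&lt;/P&gt;
&lt;P&gt;The pair that prints "works" is the one to mirror in the server-side option group settings.&lt;/P&gt;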
&lt;P&gt;&lt;BR /&gt;STEP 5: CHECK FOR ORACLE RDS-SPECIFIC CONSIDERATIONS&lt;/P&gt;
&lt;P&gt;Amazon RDS for Oracle has specific behavior around NNE:&lt;/P&gt;
&lt;P&gt;1. NNE configuration is done through RDS option groups (not sqlnet.ora directly). Make sure the NATIVE_NETWORK_ENCRYPTION option is added to your RDS option group with compatible algorithms.&lt;/P&gt;
&lt;P&gt;2. In the RDS option group, look for these settings under NATIVE_NETWORK_ENCRYPTION:&lt;BR /&gt;- SQLNET.ENCRYPTION_SERVER&lt;BR /&gt;- SQLNET.ENCRYPTION_TYPES_SERVER&lt;BR /&gt;- SQLNET.CRYPTO_CHECKSUM_SERVER&lt;BR /&gt;- SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER&lt;/P&gt;
&lt;P&gt;3. RDS documentation for NNE configuration:&lt;BR /&gt;&lt;A href="https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.Oracle.Options.NetworkEncryption.html" target="_blank"&gt;https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.Oracle.Options.NetworkEncryption.html&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;STEP 6: ENABLE JDBC DRIVER TRACING (ADDITIONAL DIAGNOSTICS)&lt;/P&gt;
&lt;P&gt;For more detail on what is happening during the connection attempt, you can enable Oracle JDBC thin driver tracing on your Databricks cluster. Add these Spark config properties to your cluster configuration:&lt;/P&gt;
&lt;P&gt;spark.driver.extraJavaOptions -Doracle.jdbc.Trace=true -Djava.util.logging.config.file=/tmp/ojdbc_logging.properties&lt;/P&gt;
&lt;P&gt;Create the logging properties file in an init script:&lt;/P&gt;
&lt;P&gt;cat &amp;gt; /tmp/ojdbc_logging.properties &amp;lt;&amp;lt; 'EOF'&lt;BR /&gt;.level=FINE&lt;BR /&gt;handlers=java.util.logging.FileHandler&lt;BR /&gt;java.util.logging.FileHandler.pattern=/tmp/ojdbc_trace.log&lt;BR /&gt;java.util.logging.FileHandler.limit=50000000&lt;BR /&gt;java.util.logging.FileHandler.count=1&lt;BR /&gt;java.util.logging.FileHandler.formatter=java.util.logging.SimpleFormatter&lt;BR /&gt;EOF&lt;/P&gt;
&lt;P&gt;After the failed connection attempt, check /tmp/ojdbc_trace.log on the driver for detailed NNE negotiation messages.&lt;/P&gt;
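&lt;P&gt;From a notebook, one quick way to read that file on the driver is via dbutils (assuming the default local-file access from the driver):&lt;/P&gt;
&lt;P&gt;# print the first 64 KB of the driver-local trace file&lt;BR /&gt;print(dbutils.fs.head("file:/tmp/ojdbc_trace.log", 65536))&lt;/P&gt;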
&lt;P&gt;&lt;BR /&gt;ADDITIONAL NOTES&lt;/P&gt;
&lt;P&gt;- The Databricks documentation confirms that Lakehouse Federation uses NNE (not TLS) for non-Oracle Cloud connections: &lt;A href="https://docs.databricks.com/aws/en/query-federation/oracle" target="_blank"&gt;https://docs.databricks.com/aws/en/query-federation/oracle&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;- Make sure your Oracle RDS security group allows inbound traffic on port 1521 from the Databricks stable egress IPs (which you mentioned you have already done).&lt;/P&gt;
&lt;P&gt;- If you are using a SQL warehouse (serverless), make sure you have configured a Network Connectivity Configuration (NCC) and that the stable NAT IPs are whitelisted on the RDS side: &lt;A href="https://docs.databricks.com/aws/en/security/network/serverless-network-security/serverless-firewall.html" target="_blank"&gt;https://docs.databricks.com/aws/en/security/network/serverless-network-security/serverless-firewall.html&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;- The "Socket read timed out" variant you saw with the Maven ojdbc driver points to the same root cause: the driver is waiting for the server to respond to its authentication/encryption handshake, but the server has already closed the socket due to algorithm negotiation failure.&lt;/P&gt;
&lt;P&gt;If you have tried all the above and are still seeing the issue, I would recommend opening a Databricks support ticket. The support team can inspect the serverless/cluster-side connection logs at a deeper level and help identify the exact negotiation failure point.&lt;/P&gt;
&lt;P&gt;* This reply was drafted with an agent system I built, which researches from the documentation I have available and from previous memory. I personally review each draft for obvious issues, monitor the system for reliability, and update the reply when I detect drift, but there is still a small chance that something is inaccurate, especially if you are experimenting with brand-new features.&lt;/P&gt;
    <pubDate>Sun, 08 Mar 2026 05:36:54 GMT</pubDate>
    <dc:creator>SteveOstrowski</dc:creator>
    <dc:date>2026-03-08T05:36:54Z</dc:date>
    <item>
      <title>Getting "socket closed" with query federation to Oracle DB on Amazon RDS</title>
      <link>https://community.databricks.com/t5/data-engineering/getting-quot-socket-closed-quot-with-query-federation-to-oracle/m-p/145573#M52533</link>
      <description>&lt;P&gt;&lt;SPAN&gt;I am following this guide to connect to an Oracle DB in Amazon RDS:&amp;nbsp;&lt;A href="https://docs.databricks.com/aws/en/query-federation/oracle" target="_blank"&gt;https://docs.databricks.com/aws/en/query-federation/oracle&lt;/A&gt;. I've created the connection, but when I go to test it, it loads for a while and then says "socket closed". From my understanding, this implies that the TCP session establishes (networking is fine) but immediately is closed by Oracle DB for whatever reason. I've gone through all the possibilities: we are using NNE, we checked and server-side NNE is at "REQUESTED" level (should be higher than "ACCEPTED"), we have stable egress IP and it is whitelisted on server side, and we have double-checked the service name. Not sure what else to do. Is there any way to get more info on the Databricks side? I tried checking driver logs but I did not see anything useful. Thanks&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 28 Jan 2026 16:26:30 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/getting-quot-socket-closed-quot-with-query-federation-to-oracle/m-p/145573#M52533</guid>
      <dc:creator>luketl2</dc:creator>
      <dc:date>2026-01-28T16:26:30Z</dc:date>
    </item>
    <item>
      <title>Re: Getting "socket closed" with query federation to Oracle DB on Amazon RDS</title>
      <link>https://community.databricks.com/t5/data-engineering/getting-quot-socket-closed-quot-with-query-federation-to-oracle/m-p/146700#M52678</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/169882"&gt;@luketl2&lt;/a&gt;&amp;nbsp;Could you please provide us the complete stack trace?&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 03 Feb 2026 10:28:43 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/getting-quot-socket-closed-quot-with-query-federation-to-oracle/m-p/146700#M52678</guid>
      <dc:creator>Saritha_S</dc:creator>
      <dc:date>2026-02-03T10:28:43Z</dc:date>
    </item>
    <item>
      <title>Re: Getting "socket closed" with query federation to Oracle DB on Amazon RDS</title>
      <link>https://community.databricks.com/t5/data-engineering/getting-quot-socket-closed-quot-with-query-federation-to-oracle/m-p/146738#M52694</link>
      <description>&lt;DIV&gt;A few notes: we know that my IP from the cluster I am testing on is whitelisted, and I have run nc -vz to their IP and port and it returns "open," so I am confident we establish tcp. Nothing in stderr on driver, but here is what I see in log4j:&lt;BR /&gt;&lt;BR /&gt;26/02/03 15:19:41 INFO ClusterLoadMonitor: Removed query with execution ID:9. Current active queries:0&lt;/DIV&gt;&lt;DIV&gt;26/02/03 15:19:41 INFO SQLExecution:&amp;nbsp; 0 QueryExecution(s) are running&lt;/DIV&gt;&lt;DIV&gt;26/02/03 15:19:41 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-19c23-fe347-5,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to previous category Some(UNDETERMINED).&lt;/DIV&gt;&lt;DIV&gt;26/02/03 15:19:41 ERROR SQLDriverLocal: Error in SQL query: -- This is a system generated query from catalog explorer&lt;/DIV&gt;&lt;DIV&gt;CREATE FOREIGN CATALOG `foreign_data` USING CONNECTION `foreign_db` OPTIONS (service_name 'db_a', test_catalog 'true')&lt;/DIV&gt;&lt;DIV&gt;org.apache.spark.SparkIllegalArgumentException: [CANNOT_ESTABLISH_CONNECTION] Cannot establish connection to remote ORACLE database. Please check connection information and credentials e.g. host, port, user, password and database options. ** If you believe the information is correct, please check your workspace's network setup and ensure it does not have outbound restrictions to the host. Please also check that the host does not block inbound connections from the network where the workspace's Spark clusters are deployed. ** Detailed error message: ORA-17002: I/O error: Socket read interrupted, Authentication lapse 0 ms.&lt;/DIV&gt;&lt;DIV&gt;&lt;A href="https://docs.oracle.com/error-help/db/ora-17002/" target="_blank"&gt;https://docs.oracle.com/error-help/db/ora-17002/&lt;/A&gt;. 
SQLSTATE: 08001&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.errors.QueryExecutionErrors$.cannotEstablishConnectionError(QueryExecutionErrors.scala:1278)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.datasources.jdbc.JDBCDialectTestConnection.testCatalogConnection(JDBCTestConnection.scala:86)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.sql.managedcatalog.command.QueryFederationCommand$.testCatalogConnection(queryFederationCommandsExec.scala:153)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.sql.managedcatalog.command.TestForeignCatalogConnectionCommand.run(queryFederationCommandsExec.scala:302)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$2(commands.scala:84)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:194)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:84)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:81)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:80)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:94)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$5(QueryExecution.scala:553)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$4(QueryExecution.scala:553)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:233)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$3(QueryExecution.scala:552)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$11(SQLExecution.scala:486)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.sql.util.MemoryTrackerHelper.withMemoryTracking(MemoryTrackerHelper.scala:80)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$10(SQLExecution.scala:422)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:810)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:352)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1481)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:217)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:747)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$2(QueryExecution.scala:548)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1426)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:544)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.withMVTagsIfNecessary(QueryExecution.scala:473)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$eagerlyExecute$1(QueryExecution.scala:542)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:621)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:616)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:521)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:85)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:521)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:42)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:361)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:357)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:42)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:42)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:497)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$8(QueryExecution.scala:616)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:418)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:616)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyCommandExecuted$1(QueryExecution.scala:429)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at scala.util.Try$.apply(Try.scala:213)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1684)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
org.apache.spark.util.Utils$.getTryWithCallerStacktrace(Utils.scala:1745)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.util.LazyTry.get(LazyTry.scala:58)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:434)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.Dataset.&amp;lt;init&amp;gt;(Dataset.scala:406)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.Dataset$.$anonfun$ofRows$3(Dataset.scala:150)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1481)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1488)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1488)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:141)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.SparkSession.$anonfun$sql$4(SparkSession.scala:1161)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1481)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:1113)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:1185)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverLocal$DbClassicStrategy.executeSQLQuery(DriverLocal.scala:372)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverLocal.executeSQLSubCommand(DriverLocal.scala:472)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$executeSql$1(DriverLocal.scala:496)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at scala.collection.immutable.List.map(List.scala:293)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverLocal.executeSql(DriverLocal.scala:491)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:39)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$36(DriverLocal.scala:1321)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:51)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:104)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$30(DriverLocal.scala:1312)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:49)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:293)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:289)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:47)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:44)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:130)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:96)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:77)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:130)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$1(DriverLocal.scala:1236)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverLocal$.$anonfun$maybeSynchronizeExecution$4(DriverLocal.scala:1721)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:879)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$2(DriverWrapper.scala:1054)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at scala.util.Try$.apply(Try.scala:213)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:1043)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$3(DriverWrapper.scala:1089)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:616)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:643)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:49)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:293)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:289)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:47)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:44)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:81)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:96)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:77)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionTags(DriverWrapper.scala:81)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:611)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:519)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.recordOperationWithResultTags(DriverWrapper.scala:81)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:1089)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:766)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:859)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$runInnerLoop$1(DriverWrapper.scala:630)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:49)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:293)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:289)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:47)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:44)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:81)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:625)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:548)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:373)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.base/java.lang.Thread.run(Thread.java:840)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;Suppressed: org.apache.spark.util.Utils$OriginalTryStackTraceException: Full stacktrace of original doTryWithCallerStacktrace caller&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.errors.QueryExecutionErrors$.cannotEstablishConnectionError(QueryExecutionErrors.scala:1278)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.datasources.jdbc.JDBCDialectTestConnection.testCatalogConnection(JDBCTestConnection.scala:86)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
com.databricks.sql.managedcatalog.command.QueryFederationCommand$.testCatalogConnection(queryFederationCommandsExec.scala:153)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.sql.managedcatalog.command.TestForeignCatalogConnectionCommand.run(queryFederationCommandsExec.scala:302)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$2(commands.scala:84)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:194)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:84)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:81)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:80)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:94)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$5(QueryExecution.scala:553)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$4(QueryExecution.scala:553)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:233)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$3(QueryExecution.scala:552)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$11(SQLExecution.scala:486)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.sql.util.MemoryTrackerHelper.withMemoryTracking(MemoryTrackerHelper.scala:80)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$10(SQLExecution.scala:422)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:810)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:352)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1481)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:217)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:747)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$2(QueryExecution.scala:548)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1426)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:544)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.withMVTagsIfNecessary(QueryExecution.scala:473)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$eagerlyExecute$1(QueryExecution.scala:542)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:621)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:616)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:521)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:85)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:521)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:42)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:361)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:357)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:42)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:42)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:497)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$8(QueryExecution.scala:616)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:418)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:616)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyCommandExecuted$1(QueryExecution.scala:429)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at scala.util.Try$.apply(Try.scala:213)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1684)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.util.LazyTry.tryT$lzycompute(LazyTry.scala:46)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.util.LazyTry.tryT(LazyTry.scala:46)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;... 
71 more&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;Caused by: java.sql.SQLRecoverableException: ORA-17002: I/O error: Socket read interrupted, Authentication lapse 0 ms.&lt;/DIV&gt;&lt;DIV&gt;&lt;A href="https://docs.oracle.com/error-help/db/ora-17002/" target="_blank"&gt;https://docs.oracle.com/error-help/db/ora-17002/&lt;/A&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.jdbc.driver.T4CConnection.handleLogonIOException(T4CConnection.java:1644)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.jdbc.driver.T4CConnection.handleLogonInterruptedIOException(T4CConnection.java:1505)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:1118)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:1175)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:105)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:886)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:693)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.datasources.jdbc.connection.BasicConnectionProvider.getConnection(BasicConnectionProvider.scala:50)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.datasources.jdbc.connection.ConnectionProviderBase.create(ConnectionProvider.scala:102)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1(JdbcDialects.scala:270)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1$adapted(JdbcDialects.scala:266)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.withConnection(JdbcUtils.scala:1373)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.datasources.jdbc.JDBCDialectTestConnection.testCatalogConnection(JDBCTestConnection.scala:83)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.sql.managedcatalog.command.QueryFederationCommand$.testCatalogConnection(queryFederationCommandsExec.scala:153)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.sql.managedcatalog.command.TestForeignCatalogConnectionCommand.run(queryFederationCommandsExec.scala:302)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$2(commands.scala:84)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:194)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:84)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:81)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:80)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:94)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$5(QueryExecution.scala:553)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$4(QueryExecution.scala:553)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:233)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$3(QueryExecution.scala:552)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$11(SQLExecution.scala:486)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.sql.util.MemoryTrackerHelper.withMemoryTracking(MemoryTrackerHelper.scala:80)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$10(SQLExecution.scala:422)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:810)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:352)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1481)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:217)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:747)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$2(QueryExecution.scala:548)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1426)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:544)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.withMVTagsIfNecessary(QueryExecution.scala:473)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$eagerlyExecute$1(QueryExecution.scala:542)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:621)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:616)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:521)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:85)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:521)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:42)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:361)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:357)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:42)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:42)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:497)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$8(QueryExecution.scala:616)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:418)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:616)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyCommandExecuted$1(QueryExecution.scala:429)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at scala.util.Try$.apply(Try.scala:213)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1684)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.util.LazyTry.tryT$lzycompute(LazyTry.scala:46)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.util.LazyTry.tryT(LazyTry.scala:46)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;... 71 more&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;Caused by: java.io.IOException: Socket read interrupted, Authentication lapse 0 ms.&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.jdbc.driver.T4CConnection.handleLogonIOException(T4CConnection.java:1639)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;... 
129 more&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;Caused by: java.io.InterruptedIOException: Socket read interrupted&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.net.nt.TimeoutSocketChannel.doBlockedRead(TimeoutSocketChannel.java:612)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.net.nt.TimeoutSocketChannel.read(TimeoutSocketChannel.java:536)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.net.ns.NSProtocolNIO.doSocketRead(NSProtocolNIO.java:1224)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.net.ns.NIOPacket.readNIOPacket(NIOPacket.java:436)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.net.ns.NSProtocolNIO.negotiateConnection(NSProtocolNIO.java:216)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.net.ns.NSProtocol.connect(NSProtocol.java:352)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.jdbc.driver.T4CConnection.connectNetworkSessionProtocol(T4CConnection.java:3180)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at shaded.databricks.oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:1002)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;... 127 more&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;26/02/03 15:19:41 INFO ProgressReporter$: Removed result fetcher for 1770130388085_8294809462390751836_c30cf945-7a17-445a-be87-4f94a23a64f9&lt;/DIV&gt;&lt;DIV&gt;26/02/03 15:19:41 INFO QueryProfileListener: Query profile sent to logger, seq number: 9, app id: app-20260203145333-0000&lt;/DIV&gt;</description>
      <pubDate>Tue, 03 Feb 2026 15:34:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/getting-quot-socket-closed-quot-with-query-federation-to-oracle/m-p/146738#M52694</guid>
      <dc:creator>luketl2</dc:creator>
      <dc:date>2026-02-03T15:34:21Z</dc:date>
    </item>
    <item>
      <title>Re: Getting "socket closed" with query federation to Oracle DB on Amazon RDS</title>
      <link>https://community.databricks.com/t5/data-engineering/getting-quot-socket-closed-quot-with-query-federation-to-oracle/m-p/146756#M52700</link>
      <description>&lt;P&gt;I also tried this using ojdbc driver from Maven instead of federation, got this stack trace in log4j:&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;DIV&gt;&lt;DIV class=""&gt;&lt;SPAN class=""&gt;Py4JJavaError: &lt;/SPAN&gt;An error occurred while calling o503.load. : java.sql.SQLRecoverableException: IO Error: Socket read timed out, Authentication lapse 0 ms. at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:874) at oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:793) at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:57) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:747) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:562) at org.apache.spark.sql.execution.datasources.jdbc.connection.BasicConnectionProvider.getConnection(BasicConnectionProvider.scala:50) at org.apache.spark.sql.execution.datasources.jdbc.connection.ConnectionProviderBase.create(ConnectionProvider.scala:102) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$2(JdbcDialects.scala:269) at org.apache.spark.sql.catalyst.MetricKeyUtils$.measure(MetricKey.scala:2259) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1(JdbcDialects.scala:269) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1$adapted(JdbcDialects.scala:265) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.withConnection(JdbcUtils.scala:1383) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:90) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:256) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.$anonfun$createRelation$1(JdbcRelationProvider.scala:50) at org.apache.spark.sql.execution.metric.SQLMetrics$.withTimingNs(SQLMetrics.scala:493) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:50) at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:455) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.org$apache$spark$sql$catalyst$analysis$ResolveDataSource$$loadV1BatchSource(ResolveDataSource.scala:277) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.$anonfun$applyOrElse$5(ResolveDataSource.scala:114) at scala.Option.getOrElse(Option.scala:201) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.applyOrElse(ResolveDataSource.scala:113) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.applyOrElse(ResolveDataSource.scala:60) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$3(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:142) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$1(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:418) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning(AnalysisHelper.scala:137) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning$(AnalysisHelper.scala:133) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUpWithPruning(LogicalPlan.scala:47) at 
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp(AnalysisHelper.scala:114) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp$(AnalysisHelper.scala:113) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:47) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.apply(ResolveDataSource.scala:60) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.apply(ResolveDataSource.scala:58) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$17(RuleExecutor.scala:509) at org.apache.spark.sql.catalyst.rules.RecoverableRuleExecutionHelper.processRule(RuleExecutor.scala:663) at org.apache.spark.sql.catalyst.rules.RecoverableRuleExecutionHelper.processRule$(RuleExecutor.scala:647) at org.apache.spark.sql.catalyst.rules.RuleExecutor.processRule(RuleExecutor.scala:143) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$16(RuleExecutor.scala:509) at com.databricks.spark.util.MemoryTracker$.withThreadAllocatedBytes(MemoryTracker.scala:51) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.measureRule(QueryPlanningTracker.scala:382) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$15(RuleExecutor.scala:507) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$14(RuleExecutor.scala:506) at scala.collection.LinearSeqOps.foldLeft(LinearSeq.scala:183) at scala.collection.LinearSeqOps.foldLeft$(LinearSeq.scala:179) at scala.collection.immutable.List.foldLeft(List.scala:79) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$13(RuleExecutor.scala:498) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeBatch$1(RuleExecutor.scala:472) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$23(RuleExecutor.scala:619) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$23$adapted(RuleExecutor.scala:619) at scala.collection.immutable.List.foreach(List.scala:334) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:619) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:365) at org.apache.spark.sql.catalyst.analysis.Analyzer.super$execute(Analyzer.scala:734) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeSameContext$1(Analyzer.scala:734) at com.databricks.sql.unity.SAMSnapshotHelper$.visitPlansDuringAnalysis(SAMSnapshotHelper.scala:40) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeSameContext(Analyzer.scala:733) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$execute$1(Analyzer.scala:706) at 
org.apache.spark.sql.catalyst.analysis.AnalysisContext$.withNewAnalysisContext(Analyzer.scala:477) at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:706) at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:629) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:353) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:265) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:353) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.resolveInFixedPoint(HybridAnalyzer.scala:438) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.$anonfun$apply$1(HybridAnalyzer.scala:97) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.withTrackedAnalyzerBridgeState(HybridAnalyzer.scala:134) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.apply(HybridAnalyzer.scala:90) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$2(Analyzer.scala:686) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:425) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:686) at com.databricks.sql.unity.SAMSnapshotHelper$.visitPlansDuringAnalysis(SAMSnapshotHelper.scala:40) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:678) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$3(QueryExecution.scala:487) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:812) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$8(QueryExecution.scala:1001) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withExecutionPhase$1(SQLExecution.scala:167) at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:348) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:59) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:344) at com.databricks.util.TracingSpanUtils$.$anonfun$withTracing$4(TracingSpanUtils.scala:246) at com.databricks.util.TracingSpanUtils$.withTracing(TracingSpanUtils.scala:130) at com.databricks.util.TracingSpanUtils$.withTracing(TracingSpanUtils.scala:244) at com.databricks.spark.util.DatabricksTracingHelper.withSpan(DatabricksSparkTracingHelper.scala:154) at com.databricks.spark.util.DBRTracing$.withSpan(DBRTracing.scala:87) at org.apache.spark.sql.execution.SQLExecution$.withExecutionPhase(SQLExecution.scala:148) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$7(QueryExecution.scala:1001) at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1660) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$5(QueryExecution.scala:994) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$4(QueryExecution.scala:991) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at 
org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$3(QueryExecution.scala:991) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:990) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.withQueryExecutionId(QueryExecution.scala:978) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:989) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:860) at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:988) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$2(QueryExecution.scala:478) at com.databricks.sql.util.MemoryTrackerHelper.withMemoryTracking(MemoryTrackerHelper.scala:111) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$1(QueryExecution.scala:477) at scala.util.Try$.apply(Try.scala:217) at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1685) at org.apache.spark.util.Utils$.getTryWithCallerStacktrace(Utils.scala:1746) at org.apache.spark.util.LazyTry.get(LazyTry.scala:75) at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:516) at org.mlflow.spark.autologging.DatasourceAttributeExtractorBase.getTableInfos(DatasourceAttributeExtractor.scala:88) at org.mlflow.spark.autologging.DatasourceAttributeExtractorBase.getTableInfos$(DatasourceAttributeExtractor.scala:85) at org.mlflow.spark.autologging.ReplAwareDatasourceAttributeExtractor$.getTableInfos(DatasourceAttributeExtractor.scala:144) at org.mlflow.spark.autologging.ReplAwareSparkDataSourceListener.onSQLExecutionEnd(ReplAwareSparkDataSourceListener.scala:49) at org.mlflow.spark.autologging.SparkDataSourceListener.$anonfun$onOtherEvent$1(SparkDataSourceListener.scala:39) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) at org.mlflow.spark.autologging.ExceptionUtils$.tryAndLogUnexpectedError(ExceptionUtils.scala:26) at org.mlflow.spark.autologging.SparkDataSourceListener.onOtherEvent(SparkDataSourceListener.scala:39) at org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:108) at org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28) at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:46) at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:46) at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:208) at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:172) at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:183) at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:183) at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:59) at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:119) at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:115) at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1573) at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:115) Suppressed: org.apache.spark.util.Utils$OriginalTryStackTraceException: Full stacktrace of original doTryWithCallerStacktrace caller at 
oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:874) at oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:793) at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:57) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:747) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:562) at org.apache.spark.sql.execution.datasources.jdbc.connection.BasicConnectionProvider.getConnection(BasicConnectionProvider.scala:50) at org.apache.spark.sql.execution.datasources.jdbc.connection.ConnectionProviderBase.create(ConnectionProvider.scala:102) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$2(JdbcDialects.scala:269) at org.apache.spark.sql.catalyst.MetricKeyUtils$.measure(MetricKey.scala:2259) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1(JdbcDialects.scala:269) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1$adapted(JdbcDialects.scala:265) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.withConnection(JdbcUtils.scala:1383) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:90) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:256) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.$anonfun$createRelation$1(JdbcRelationProvider.scala:50) at org.apache.spark.sql.execution.metric.SQLMetrics$.withTimingNs(SQLMetrics.scala:493) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:50) at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:455) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.org$apache$spark$sql$catalyst$analysis$ResolveDataSource$$loadV1BatchSource(ResolveDataSource.scala:277) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.$anonfun$applyOrElse$5(ResolveDataSource.scala:114) at scala.Option.getOrElse(Option.scala:201) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.applyOrElse(ResolveDataSource.scala:113) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.applyOrElse(ResolveDataSource.scala:60) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$3(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:142) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$1(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:418) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning(AnalysisHelper.scala:137) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning$(AnalysisHelper.scala:133) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUpWithPruning(LogicalPlan.scala:47) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp(AnalysisHelper.scala:114) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp$(AnalysisHelper.scala:113) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:47) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.apply(ResolveDataSource.scala:60) at 
org.apache.spark.sql.catalyst.analysis.ResolveDataSource.apply(ResolveDataSource.scala:58) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$17(RuleExecutor.scala:509) at org.apache.spark.sql.catalyst.rules.RecoverableRuleExecutionHelper.processRule(RuleExecutor.scala:663) at org.apache.spark.sql.catalyst.rules.RecoverableRuleExecutionHelper.processRule$(RuleExecutor.scala:647) at org.apache.spark.sql.catalyst.rules.RuleExecutor.processRule(RuleExecutor.scala:143) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$16(RuleExecutor.scala:509) at com.databricks.spark.util.MemoryTracker$.withThreadAllocatedBytes(MemoryTracker.scala:51) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.measureRule(QueryPlanningTracker.scala:382) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$15(RuleExecutor.scala:507) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$14(RuleExecutor.scala:506) at scala.collection.LinearSeqOps.foldLeft(LinearSeq.scala:183) at scala.collection.LinearSeqOps.foldLeft$(LinearSeq.scala:179) at scala.collection.immutable.List.foldLeft(List.scala:79) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$13(RuleExecutor.scala:498) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeBatch$1(RuleExecutor.scala:472) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$23(RuleExecutor.scala:619) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$23$adapted(RuleExecutor.scala:619) at scala.collection.immutable.List.foreach(List.scala:334) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:619) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:365) at org.apache.spark.sql.catalyst.analysis.Analyzer.super$execute(Analyzer.scala:734) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeSameContext$1(Analyzer.scala:734) at com.databricks.sql.unity.SAMSnapshotHelper$.visitPlansDuringAnalysis(SAMSnapshotHelper.scala:40) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeSameContext(Analyzer.scala:733) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$execute$1(Analyzer.scala:706) at org.apache.spark.sql.catalyst.analysis.AnalysisContext$.withNewAnalysisContext(Analyzer.scala:477) at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:706) at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:629) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:353) at 
org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:265) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:353) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.resolveInFixedPoint(HybridAnalyzer.scala:438) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.$anonfun$apply$1(HybridAnalyzer.scala:97) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.withTrackedAnalyzerBridgeState(HybridAnalyzer.scala:134) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.apply(HybridAnalyzer.scala:90) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$2(Analyzer.scala:686) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:425) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:686) at com.databricks.sql.unity.SAMSnapshotHelper$.visitPlansDuringAnalysis(SAMSnapshotHelper.scala:40) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:678) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$3(QueryExecution.scala:487) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:812) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$8(QueryExecution.scala:1001) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withExecutionPhase$1(SQLExecution.scala:167) at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:348) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:59) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:344) at com.databricks.util.TracingSpanUtils$.$anonfun$withTracing$4(TracingSpanUtils.scala:246) at com.databricks.util.TracingSpanUtils$.withTracing(TracingSpanUtils.scala:130) at com.databricks.util.TracingSpanUtils$.withTracing(TracingSpanUtils.scala:244) at com.databricks.spark.util.DatabricksTracingHelper.withSpan(DatabricksSparkTracingHelper.scala:154) at com.databricks.spark.util.DBRTracing$.withSpan(DBRTracing.scala:87) at org.apache.spark.sql.execution.SQLExecution$.withExecutionPhase(SQLExecution.scala:148) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$7(QueryExecution.scala:1001) at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1660) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$5(QueryExecution.scala:994) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$4(QueryExecution.scala:991) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$3(QueryExecution.scala:991) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:990) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at 
org.apache.spark.sql.execution.QueryExecution.withQueryExecutionId(QueryExecution.scala:978) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:989) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:860) at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:988) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$2(QueryExecution.scala:478) at com.databricks.sql.util.MemoryTrackerHelper.withMemoryTracking(MemoryTrackerHelper.scala:111) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$1(QueryExecution.scala:477) at scala.util.Try$.apply(Try.scala:217) at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1685) at org.apache.spark.util.LazyTry.tryT$lzycompute(LazyTry.scala:60) at org.apache.spark.util.LazyTry.tryT(LazyTry.scala:59) at org.apache.spark.util.LazyTry.get(LazyTry.scala:75) at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:516) at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:404) at org.apache.spark.sql.classic.Dataset$.$anonfun$ofRows$1(Dataset.scala:128) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:860) at org.apache.spark.sql.classic.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1245) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.classic.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1245) at org.apache.spark.sql.classic.Dataset$.ofRows(Dataset.scala:126) at org.apache.spark.sql.classic.DataFrameReader.load(DataFrameReader.scala:164) at org.apache.spark.sql.classic.DataFrameReader.load(DataFrameReader.scala:140) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:75) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:52) at java.base/java.lang.reflect.Method.invoke(Method.java:580) at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:397) at py4j.Gateway.invoke(Gateway.java:306) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:197) at py4j.ClientServerConnection.run(ClientServerConnection.java:117) at java.base/java.lang.Thread.run(Thread.java:1583) Caused by: java.io.IOException: Socket read timed out, Authentication lapse 0 ms. 
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:870) at oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:793) at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:57) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:747) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:562) at org.apache.spark.sql.execution.datasources.jdbc.connection.BasicConnectionProvider.getConnection(BasicConnectionProvider.scala:50) at org.apache.spark.sql.execution.datasources.jdbc.connection.ConnectionProviderBase.create(ConnectionProvider.scala:102) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$2(JdbcDialects.scala:269) at org.apache.spark.sql.catalyst.MetricKeyUtils$.measure(MetricKey.scala:2259) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1(JdbcDialects.scala:269) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1$adapted(JdbcDialects.scala:265) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.withConnection(JdbcUtils.scala:1383) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:90) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:256) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.$anonfun$createRelation$1(JdbcRelationProvider.scala:50) at org.apache.spark.sql.execution.metric.SQLMetrics$.withTimingNs(SQLMetrics.scala:493) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:50) at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:455) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.org$apache$spark$sql$catalyst$analysis$ResolveDataSource$$loadV1BatchSource(ResolveDataSource.scala:277) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.$anonfun$applyOrElse$5(ResolveDataSource.scala:114) at scala.Option.getOrElse(Option.scala:201) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.applyOrElse(ResolveDataSource.scala:113) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.applyOrElse(ResolveDataSource.scala:60) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$3(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:142) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$1(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:418) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning(AnalysisHelper.scala:137) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning$(AnalysisHelper.scala:133) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUpWithPruning(LogicalPlan.scala:47) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp(AnalysisHelper.scala:114) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp$(AnalysisHelper.scala:113) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:47) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.apply(ResolveDataSource.scala:60) at 
org.apache.spark.sql.catalyst.analysis.ResolveDataSource.apply(ResolveDataSource.scala:58) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$17(RuleExecutor.scala:509) at org.apache.spark.sql.catalyst.rules.RecoverableRuleExecutionHelper.processRule(RuleExecutor.scala:663) at org.apache.spark.sql.catalyst.rules.RecoverableRuleExecutionHelper.processRule$(RuleExecutor.scala:647) at org.apache.spark.sql.catalyst.rules.RuleExecutor.processRule(RuleExecutor.scala:143) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$16(RuleExecutor.scala:509) at com.databricks.spark.util.MemoryTracker$.withThreadAllocatedBytes(MemoryTracker.scala:51) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.measureRule(QueryPlanningTracker.scala:382) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$15(RuleExecutor.scala:507) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$14(RuleExecutor.scala:506) at scala.collection.LinearSeqOps.foldLeft(LinearSeq.scala:183) at scala.collection.LinearSeqOps.foldLeft$(LinearSeq.scala:179) at scala.collection.immutable.List.foldLeft(List.scala:79) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$13(RuleExecutor.scala:498) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeBatch$1(RuleExecutor.scala:472) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$23(RuleExecutor.scala:619) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$23$adapted(RuleExecutor.scala:619) at scala.collection.immutable.List.foreach(List.scala:334) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:619) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:365) at org.apache.spark.sql.catalyst.analysis.Analyzer.super$execute(Analyzer.scala:734) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeSameContext$1(Analyzer.scala:734) at com.databricks.sql.unity.SAMSnapshotHelper$.visitPlansDuringAnalysis(SAMSnapshotHelper.scala:40) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeSameContext(Analyzer.scala:733) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$execute$1(Analyzer.scala:706) at org.apache.spark.sql.catalyst.analysis.AnalysisContext$.withNewAnalysisContext(Analyzer.scala:477) at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:706) at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:629) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:353) at 
org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:265) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:353) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.resolveInFixedPoint(HybridAnalyzer.scala:438) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.$anonfun$apply$1(HybridAnalyzer.scala:97) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.withTrackedAnalyzerBridgeState(HybridAnalyzer.scala:134) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.apply(HybridAnalyzer.scala:90) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$2(Analyzer.scala:686) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:425) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:686) at com.databricks.sql.unity.SAMSnapshotHelper$.visitPlansDuringAnalysis(SAMSnapshotHelper.scala:40) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:678) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$3(QueryExecution.scala:487) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:812) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$8(QueryExecution.scala:1001) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withExecutionPhase$1(SQLExecution.scala:167) at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:348) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:59) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:344) at com.databricks.util.TracingSpanUtils$.$anonfun$withTracing$4(TracingSpanUtils.scala:246) at com.databricks.util.TracingSpanUtils$.withTracing(TracingSpanUtils.scala:130) at com.databricks.util.TracingSpanUtils$.withTracing(TracingSpanUtils.scala:244) at com.databricks.spark.util.DatabricksTracingHelper.withSpan(DatabricksSparkTracingHelper.scala:154) at com.databricks.spark.util.DBRTracing$.withSpan(DBRTracing.scala:87) at org.apache.spark.sql.execution.SQLExecution$.withExecutionPhase(SQLExecution.scala:148) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$7(QueryExecution.scala:1001) at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1660) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$5(QueryExecution.scala:994) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$4(QueryExecution.scala:991) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$3(QueryExecution.scala:991) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:990) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at 
org.apache.spark.sql.execution.QueryExecution.withQueryExecutionId(QueryExecution.scala:978) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:989) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:860) at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:988) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$2(QueryExecution.scala:478) at com.databricks.sql.util.MemoryTrackerHelper.withMemoryTracking(MemoryTrackerHelper.scala:111) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$1(QueryExecution.scala:477) at scala.util.Try$.apply(Try.scala:217) at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1685) at org.apache.spark.util.LazyTry.tryT$lzycompute(LazyTry.scala:60) at org.apache.spark.util.LazyTry.tryT(LazyTry.scala:59) at org.apache.spark.util.LazyTry.get(LazyTry.scala:75) at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:516) at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:404) at org.apache.spark.sql.classic.Dataset$.$anonfun$ofRows$1(Dataset.scala:128) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:860) at org.apache.spark.sql.classic.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1245) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.classic.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1245) at org.apache.spark.sql.classic.Dataset$.ofRows(Dataset.scala:126) at org.apache.spark.sql.classic.DataFrameReader.load(DataFrameReader.scala:164) at org.apache.spark.sql.classic.DataFrameReader.load(DataFrameReader.scala:140) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:75) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:52) at java.base/java.lang.reflect.Method.invoke(Method.java:580) at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:397) at py4j.Gateway.invoke(Gateway.java:306) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:197) at py4j.ClientServerConnection.run(ClientServerConnection.java:117) at java.base/java.lang.Thread.run(Thread.java:1583) Caused by: oracle.net.nt.TimeoutInterruptHandler$IOReadTimeoutException: Socket read timed out at oracle.net.nt.TimeoutSocketChannel.handleInterrupt(TimeoutSocketChannel.java:254) at oracle.net.nt.TimeoutSocketChannel.read(TimeoutSocketChannel.java:180) at oracle.net.ns.NSProtocolNIO.doSocketRead(NSProtocolNIO.java:555) at oracle.net.ns.NIOPacket.readNIOPacket(NIOPacket.java:403) at oracle.net.ns.NSProtocolNIO.negotiateConnection(NSProtocolNIO.java:127) at oracle.net.ns.NSProtocol.connect(NSProtocol.java:340) at oracle.jdbc.driver.T4CConnection.connect(T4CConnection.java:1596) at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:588) ... 
145 more&lt;/DIV&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;DIV class=""&gt;&lt;HR /&gt;&lt;/DIV&gt;&lt;/DIV&gt;</description>
      <pubDate>Tue, 03 Feb 2026 20:21:25 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/getting-quot-socket-closed-quot-with-query-federation-to-oracle/m-p/146756#M52700</guid>
      <dc:creator>luketl2</dc:creator>
      <dc:date>2026-02-03T20:21:25Z</dc:date>
    </item>
    <item>
      <title>Re: Getting "socket closed" with query federation to Oracle DB on Amazon RDS</title>
      <link>https://community.databricks.com/t5/data-engineering/getting-quot-socket-closed-quot-with-query-federation-to-oracle/m-p/150156#M53276</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/169882"&gt;@luketl2&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Based on your description and the stack trace you shared, the root cause is almost certainly an NNE (Native Network Encryption) algorithm mismatch between the Oracle JDBC thin driver that Databricks uses and your Oracle RDS server-side configuration. The key clue is this line in your stack trace:&lt;/P&gt;
&lt;P&gt;Authentication lapse 0 ms&lt;/P&gt;
&lt;P&gt;This means the TCP handshake completed successfully (which you already confirmed with nc -vz), the JDBC driver initiated the Oracle Net session, but encryption/integrity algorithm negotiation failed during authentication. Oracle then drops the socket before the authentication exchange finishes.&lt;/P&gt;
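&lt;P&gt;To make the layering concrete: a plain TCP check passes even when NNE negotiation is doomed, because the mismatch only surfaces inside the Oracle Net handshake. A minimal Python sketch (YOUR_HOST is a placeholder) that proves no more than nc -vz does:&lt;/P&gt;
&lt;P&gt;import socket&lt;BR /&gt;&lt;BR /&gt;# Succeeds whenever the listener accepts TCP connections;&lt;BR /&gt;# says nothing about encryption/integrity negotiation.&lt;BR /&gt;sock = socket.create_connection(("YOUR_HOST", 1521), timeout=5)&lt;BR /&gt;print("TCP connect OK:", sock.getpeername())&lt;BR /&gt;sock.close()&lt;/P&gt;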
&lt;P&gt;Here is a systematic approach to diagnose and resolve this.&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;STEP 1: VERIFY THE SERVER-SIDE NNE ALGORITHM LIST&lt;/P&gt;
&lt;P&gt;On your Oracle RDS instance, check which encryption and integrity (checksum) algorithms are configured. Note that the sqlnet.* settings are Oracle Net settings rather than instance initialization parameters, so they will not show up in v$parameter; on RDS they are exposed as option settings (see STEP 5). What you can check from a privileged session is what an already-connected session actually negotiated:&lt;/P&gt;
&lt;P&gt;SELECT sid, network_service_banner&lt;BR /&gt;FROM v$session_connect_info&lt;BR /&gt;WHERE sid = SYS_CONTEXT('USERENV', 'SID');&lt;/P&gt;
&lt;P&gt;When NNE is active, the banner rows mention the encryption and crypto-checksumming service adapters and the algorithms in use.&lt;/P&gt;
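&lt;P&gt;If it is more convenient to run that check from Databricks, the same query works as a Spark JDBC subquery once any connection succeeds (for example after STEP 4 below finds a working algorithm pair). A sketch with placeholder host and credentials:&lt;/P&gt;
&lt;P&gt;banner_df = (spark.read.format("jdbc")&lt;BR /&gt;.option("url", "jdbc:oracle:thin:@//YOUR_HOST:1521/YOUR_SERVICE_NAME")&lt;BR /&gt;.option("user", "YOUR_USER")&lt;BR /&gt;.option("password", "YOUR_PASSWORD")&lt;BR /&gt;.option("dbtable", "(SELECT sid, network_service_banner FROM v$session_connect_info WHERE sid = SYS_CONTEXT('USERENV', 'SID'))")&lt;BR /&gt;.load())&lt;BR /&gt;banner_df.show(truncate=False)&lt;/P&gt;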
&lt;P&gt;Alternatively, if you have access to the RDS option group (see STEP 5), look for these settings under the NATIVE_NETWORK_ENCRYPTION option:&lt;/P&gt;
&lt;P&gt;SQLNET.ENCRYPTION_SERVER = REQUESTED (you confirmed this)&lt;BR /&gt;SQLNET.ENCRYPTION_TYPES_SERVER = (list of algorithms)&lt;BR /&gt;SQLNET.CRYPTO_CHECKSUM_SERVER = REQUESTED or ACCEPTED&lt;BR /&gt;SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER = (list of algorithms)&lt;/P&gt;
&lt;P&gt;The Oracle JDBC thin driver that Databricks uses supports a specific set of NNE algorithms. If your server is configured with only algorithms the thin driver does not support (for example, older ones like DES or 3DES112, or newer ones not yet in the driver), negotiation fails silently and Oracle closes the socket.&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;STEP 2: ALIGN ENCRYPTION ALGORITHMS&lt;/P&gt;
&lt;P&gt;The Oracle JDBC thin driver typically supports these NNE encryption algorithms:&lt;/P&gt;
&lt;P&gt;AES256&lt;BR /&gt;AES192&lt;BR /&gt;AES128&lt;BR /&gt;3DES168&lt;/P&gt;
&lt;P&gt;And these integrity/checksum algorithms:&lt;/P&gt;
&lt;P&gt;SHA256&lt;BR /&gt;SHA384&lt;BR /&gt;SHA512&lt;BR /&gt;SHA1&lt;/P&gt;
&lt;P&gt;Make sure your configuration includes at least one algorithm from each list above. For example, in the NATIVE_NETWORK_ENCRYPTION option of your custom RDS option group, set:&lt;/P&gt;
&lt;P&gt;SQLNET.ENCRYPTION_TYPES_SERVER = AES256,AES192,AES128&lt;BR /&gt;SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER = SHA256,SHA1&lt;/P&gt;
&lt;P&gt;Option group changes can be applied immediately or deferred to the next maintenance window; existing sessions keep whatever they negotiated, so reconnect (and, if the change does not seem to take, reboot the instance) before retesting.&lt;/P&gt;
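&lt;P&gt;If you manage the option group with boto3, the change looks roughly like this. This is a sketch only: "my-oracle-nne" is a placeholder option group name, and you should verify the option setting names against the AWS docs linked in STEP 5.&lt;/P&gt;
&lt;P&gt;import boto3&lt;BR /&gt;&lt;BR /&gt;rds = boto3.client("rds")&lt;BR /&gt;&lt;BR /&gt;# Align the server-side NNE lists with algorithms the thin driver supports.&lt;BR /&gt;rds.modify_option_group(&lt;BR /&gt;OptionGroupName="my-oracle-nne",  # placeholder&lt;BR /&gt;OptionsToInclude=[{&lt;BR /&gt;"OptionName": "NATIVE_NETWORK_ENCRYPTION",&lt;BR /&gt;"OptionSettings": [&lt;BR /&gt;{"Name": "SQLNET.ENCRYPTION_SERVER", "Value": "REQUESTED"},&lt;BR /&gt;{"Name": "SQLNET.ENCRYPTION_TYPES_SERVER", "Value": "AES256,AES192,AES128"},&lt;BR /&gt;{"Name": "SQLNET.CRYPTO_CHECKSUM_SERVER", "Value": "REQUESTED"},&lt;BR /&gt;{"Name": "SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER", "Value": "SHA256,SHA1"},&lt;BR /&gt;],&lt;BR /&gt;}],&lt;BR /&gt;ApplyImmediately=True,&lt;BR /&gt;)&lt;/P&gt;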
&lt;P&gt;&lt;BR /&gt;STEP 3: CHECK THE CRYPTO_CHECKSUM SETTING&lt;/P&gt;
&lt;P&gt;A commonly overlooked cause of "socket closed" during NNE negotiation is the integrity/checksum configuration. Even if encryption negotiation succeeds, if the checksum algorithms do not match, Oracle will close the connection. Make sure:&lt;/P&gt;
&lt;P&gt;sqlnet.crypto_checksum_server = REQUESTED (or ACCEPTED)&lt;BR /&gt;sqlnet.crypto_checksum_types_server includes at least SHA256 or SHA1&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;STEP 4: TEST WITH EXPLICIT JDBC PROPERTIES (OPTIONAL DIAGNOSTIC)&lt;/P&gt;
&lt;P&gt;Since you already tried using the ojdbc driver from Maven directly, you can further diagnose by setting explicit NNE properties on the JDBC connection. In a notebook on your cluster, try:&lt;/P&gt;
&lt;P&gt;jdbc_url = "jdbc:oracle:thin:@//YOUR_HOST:1521/YOUR_SERVICE_NAME"&lt;/P&gt;
&lt;P&gt;connection_properties = {&lt;BR /&gt;"user": "YOUR_USER",&lt;BR /&gt;"password": "YOUR_PASSWORD",&lt;BR /&gt;"oracle.net.encryption_client": "REQUESTED",&lt;BR /&gt;"oracle.net.encryption_types_client": "(AES256)",&lt;BR /&gt;"oracle.net.crypto_checksum_client": "REQUESTED",&lt;BR /&gt;"oracle.net.crypto_checksum_types_client": "(SHA256)",&lt;BR /&gt;"oracle.jdbc.timezoneAsRegion": "false"&lt;BR /&gt;}&lt;/P&gt;
&lt;P&gt;df = spark.read.format("jdbc") \&lt;BR /&gt;.option("url", jdbc_url) \&lt;BR /&gt;.option("dbtable", "(SELECT 1 FROM DUAL)") \&lt;BR /&gt;.options(**connection_properties) \&lt;BR /&gt;.load()&lt;/P&gt;
&lt;P&gt;df.show()&lt;/P&gt;
&lt;P&gt;If this works, the issue is confirmed as an algorithm mismatch. If it still fails, try changing the encryption_types_client and crypto_checksum_types_client values to match what your server supports.&lt;/P&gt;
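&lt;P&gt;If the single-algorithm test still fails, you can sweep the candidate pairs and see which (if any) connect; load() resolves the schema eagerly, so it exercises the full logon. A sketch reusing jdbc_url and connection_properties from above:&lt;/P&gt;
&lt;P&gt;for enc in ["AES256", "AES192", "AES128", "3DES168"]:&lt;BR /&gt;    for chk in ["SHA256", "SHA1"]:&lt;BR /&gt;        props = dict(connection_properties)&lt;BR /&gt;        props["oracle.net.encryption_types_client"] = f"({enc})"&lt;BR /&gt;        props["oracle.net.crypto_checksum_types_client"] = f"({chk})"&lt;BR /&gt;        try:&lt;BR /&gt;            (spark.read.format("jdbc")&lt;BR /&gt;                .option("url", jdbc_url)&lt;BR /&gt;                .option("dbtable", "(SELECT 1 FROM DUAL)")&lt;BR /&gt;                .options(**props)&lt;BR /&gt;                .load())&lt;BR /&gt;            print(f"OK: {enc} / {chk}")&lt;BR /&gt;        except Exception as e:&lt;BR /&gt;            print(f"FAILED: {enc} / {chk} ({type(e).__name__})")&lt;/P&gt;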
&lt;P&gt;&lt;BR /&gt;STEP 5: CHECK FOR ORACLE RDS-SPECIFIC CONSIDERATIONS&lt;/P&gt;
&lt;P&gt;Amazon RDS for Oracle has specific behavior around NNE:&lt;/P&gt;
&lt;P&gt;1. NNE configuration is done through RDS option groups (not sqlnet.ora directly). Make sure the NATIVE_NETWORK_ENCRYPTION option is added to your RDS option group with compatible algorithms.&lt;/P&gt;
&lt;P&gt;2. In the RDS option group, look for these settings under NATIVE_NETWORK_ENCRYPTION:&lt;BR /&gt;- SQLNET.ENCRYPTION_SERVER&lt;BR /&gt;- SQLNET.ENCRYPTION_TYPES_SERVER&lt;BR /&gt;- SQLNET.CRYPTO_CHECKSUM_SERVER&lt;BR /&gt;- SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER&lt;/P&gt;
&lt;P&gt;3. RDS documentation for NNE configuration:&lt;BR /&gt;&lt;A href="https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.Oracle.Options.NetworkEncryption.html" target="_blank"&gt;https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.Oracle.Options.NetworkEncryption.html&lt;/A&gt;&lt;/P&gt;
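&lt;P&gt;If you prefer to inspect the current settings programmatically rather than through the console, a boto3 sketch (same placeholder option group name as above):&lt;/P&gt;
&lt;P&gt;import boto3&lt;BR /&gt;&lt;BR /&gt;rds = boto3.client("rds")&lt;BR /&gt;resp = rds.describe_option_groups(OptionGroupName="my-oracle-nne")  # placeholder&lt;BR /&gt;for og in resp["OptionGroupsList"]:&lt;BR /&gt;    for opt in og["Options"]:&lt;BR /&gt;        if opt["OptionName"] == "NATIVE_NETWORK_ENCRYPTION":&lt;BR /&gt;            for setting in opt["OptionSettings"]:&lt;BR /&gt;                print(setting["Name"], "=", setting["Value"])&lt;/P&gt;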
&lt;P&gt;&lt;BR /&gt;STEP 6: ENABLE JDBC DRIVER TRACING (ADDITIONAL DIAGNOSTICS)&lt;/P&gt;
&lt;P&gt;For more detail on what happens during the connection attempt, you can enable Oracle JDBC thin driver tracing on your Databricks cluster. (How much detail you get depends on the driver build; the debug "_g" variants of the ojdbc jar log the most.) Add this line to the Spark config in your cluster configuration:&lt;/P&gt;
&lt;P&gt;spark.driver.extraJavaOptions -Doracle.jdbc.Trace=true -Djava.util.logging.config.file=/tmp/ojdbc_logging.properties&lt;/P&gt;
&lt;P&gt;Create the logging properties file in a cluster-scoped init script:&lt;/P&gt;
&lt;P&gt;#!/bin/bash&lt;BR /&gt;cat &amp;gt; /tmp/ojdbc_logging.properties &amp;lt;&amp;lt; 'EOF'&lt;BR /&gt;.level=FINE&lt;BR /&gt;handlers=java.util.logging.FileHandler&lt;BR /&gt;java.util.logging.FileHandler.pattern=/tmp/ojdbc_trace.log&lt;BR /&gt;java.util.logging.FileHandler.limit=50000000&lt;BR /&gt;java.util.logging.FileHandler.count=1&lt;BR /&gt;java.util.logging.FileHandler.formatter=java.util.logging.SimpleFormatter&lt;BR /&gt;EOF&lt;/P&gt;
&lt;P&gt;After the failed connection attempt, check /tmp/ojdbc_trace.log on the driver for detailed NNE negotiation messages.&lt;/P&gt;
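&lt;P&gt;You can read that file from a notebook without SSH access to the driver, for example:&lt;/P&gt;
&lt;P&gt;# Print up to ~100 KB of the driver-local trace file.&lt;BR /&gt;print(dbutils.fs.head("file:/tmp/ojdbc_trace.log", 100000))&lt;/P&gt;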
&lt;P&gt;&lt;BR /&gt;ADDITIONAL NOTES&lt;/P&gt;
&lt;P&gt;- The Databricks documentation confirms that Lakehouse Federation uses NNE (not TLS) for non-Oracle Cloud connections: &lt;A href="https://docs.databricks.com/aws/en/query-federation/oracle" target="_blank"&gt;https://docs.databricks.com/aws/en/query-federation/oracle&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;- Make sure your Oracle RDS security group allows inbound traffic on port 1521 from the Databricks stable egress IPs (which you mentioned you have already done).&lt;/P&gt;
&lt;P&gt;- If you are using a SQL warehouse (serverless), make sure you have configured a Network Connectivity Configuration (NCC) and that the stable NAT IPs are whitelisted on the RDS side: &lt;A href="https://docs.databricks.com/aws/en/security/network/serverless-network-security/serverless-firewall.html" target="_blank"&gt;https://docs.databricks.com/aws/en/security/network/serverless-network-security/serverless-firewall.html&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;- The "Socket read timed out" variant you saw with the Maven ojdbc driver points to the same root cause: the driver is waiting for the server to respond to its authentication/encryption handshake, but the server has already closed the socket due to algorithm negotiation failure.&lt;/P&gt;
&lt;P&gt;If you have tried all the above and are still seeing the issue, I would recommend opening a Databricks support ticket. The support team can inspect the serverless/cluster-side connection logs at a deeper level and help identify the exact negotiation failure point.&lt;/P&gt;
&lt;P&gt;* This reply was drafted with an agent system I built, which researches and drafts responses from the wide set of documentation I have available and from previous memory. I personally review each draft for obvious issues and to monitor the system's reliability, and I update it when I detect drift, but there is still a small chance something is inaccurate, especially if you are experimenting with brand-new features.&lt;/P&gt;</description>
      <pubDate>Sun, 08 Mar 2026 05:36:54 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/getting-quot-socket-closed-quot-with-query-federation-to-oracle/m-p/150156#M53276</guid>
      <dc:creator>SteveOstrowski</dc:creator>
      <dc:date>2026-03-08T05:36:54Z</dc:date>
    </item>
    <item>
      <title>Re: Getting "socket closed" with query federation to Oracle DB on Amazon RDS</title>
      <link>https://community.databricks.com/t5/data-engineering/getting-quot-socket-closed-quot-with-query-federation-to-oracle/m-p/151296#M53621</link>
      <description>&lt;P&gt;Accepting this as the answer because the issue turned out to be a firewall blocking port 1521: after eliminating all the other options we dug deeper on the networking side and found it. Thanks!&lt;/P&gt;</description>
      <pubDate>Wed, 18 Mar 2026 15:26:41 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/getting-quot-socket-closed-quot-with-query-federation-to-oracle/m-p/151296#M53621</guid>
      <dc:creator>luketl2</dc:creator>
      <dc:date>2026-03-18T15:26:41Z</dc:date>
    </item>
  </channel>
</rss>