Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Getting "socket closed" with query federation to Oracle DB on Amazon RDS

luketl2
Contributor

I am following this guide to connect to an Oracle DB in Amazon RDS: https://docs.databricks.com/aws/en/query-federation/oracle. I've created the connection, but when I test it, it loads for a while and then fails with "socket closed". My understanding is that this means the TCP session is established (so networking is fine) but is immediately closed by the Oracle DB for some reason. I've gone through the likely causes: we are using NNE, we checked that server-side NNE is at the "REQUESTED" level (which should be higher than "ACCEPTED"), we have a stable egress IP and it is whitelisted on the server side, and we have double-checked the service name. Not sure what else to try. Is there any way to get more information on the Databricks side? I checked the driver logs but did not see anything useful. Thanks

1 ACCEPTED SOLUTION


SteveOstrowski
Databricks Employee

Hi @luketl2,

Based on your description and the stack trace you shared, the root cause is almost certainly an NNE (Native Network Encryption) algorithm mismatch between the Oracle JDBC thin driver that Databricks uses and your Oracle RDS server-side configuration. The key clue is this line in your stack trace:

Authentication lapse 0 ms

This means the TCP handshake completed successfully (which you already confirmed with nc -vz), the JDBC driver initiated the Oracle Net session, but encryption/integrity algorithm negotiation failed during authentication. Oracle then drops the socket before the authentication exchange finishes.

Here is a systematic approach to diagnose and resolve this.


STEP 1: VERIFY THE SERVER-SIDE NNE ALGORITHM LIST

On your Oracle RDS instance, check which encryption and integrity (checksum) algorithms are configured. Note that the sqlnet.* settings are not database initialization parameters, so they will not appear in V$PARAMETER. What you can do from a SQL session is check what an existing connection actually negotiated:

SELECT network_service_banner
FROM v$session_connect_info
WHERE sid = SYS_CONTEXT('USERENV', 'SID');

The banner rows show which encryption and crypto-checksumming services (if any) are active for the current session.

Or if you have access to the RDS parameter group, look for these parameters:

sqlnet.encryption_server = REQUESTED (you confirmed this)
sqlnet.encryption_types_server = (list of algorithms)
sqlnet.crypto_checksum_server = REQUESTED or ACCEPTED
sqlnet.crypto_checksum_types_server = (list of algorithms)

The Databricks JDBC thin driver supports a specific set of NNE algorithms. If your server is configured with algorithms the thin driver does not support (for example, older algorithms like DES or 3DES112 only, or newer ones not yet in the driver), negotiation will fail silently and Oracle closes the socket.


STEP 2: ALIGN ENCRYPTION ALGORITHMS

The Oracle JDBC thin driver typically supports these NNE encryption algorithms:

AES256
AES192
AES128
3DES168

And these integrity/checksum algorithms:

SHA256
SHA384
SHA512
SHA1

Make sure your RDS parameter group includes at least one algorithm from each list above. For example, in your RDS custom parameter group, set:

sqlnet.encryption_types_server = AES256,AES192,AES128
sqlnet.crypto_checksum_types_server = SHA256,SHA1

After changing RDS parameter group values, you may need to reboot the RDS instance for the changes to take effect (depending on whether the parameter is static or dynamic).
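To make the failure mode concrete, here is a minimal sketch (not the actual driver code) that models NNE negotiation as an ordered-preference intersection of the client and server algorithm lists. The algorithm names are illustrative examples from the lists above:

```python
# Sketch: NNE negotiation succeeds only if the client and server share
# at least one algorithm (assuming both sides are at REQUESTED level or
# stricter). This is a simplified model, not the Oracle implementation.

def negotiate(client_algos, server_algos):
    """Return the first client-preferred algorithm the server also
    supports, or None if there is no overlap (socket gets dropped)."""
    server_set = set(server_algos)
    for algo in client_algos:
        if algo in server_set:
            return algo
    return None

# Typical thin-driver encryption list, per the table above:
driver_encryption = ["AES256", "AES192", "AES128", "3DES168"]

# A server configured with only legacy algorithms reproduces the
# silent failure:
assert negotiate(driver_encryption, ["DES", "3DES112"]) is None

# Adding one shared algorithm fixes it:
assert negotiate(driver_encryption, ["AES256", "DES"]) == "AES256"
```

The same logic applies independently to the crypto-checksum lists, which is why Step 3 below matters even when encryption negotiation succeeds.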


STEP 3: CHECK THE CRYPTO_CHECKSUM SETTING

A commonly overlooked cause of "socket closed" during NNE negotiation is the integrity/checksum configuration. Even if encryption negotiation succeeds, if the checksum algorithms do not match, Oracle will close the connection. Make sure:

sqlnet.crypto_checksum_server = REQUESTED (or ACCEPTED)
sqlnet.crypto_checksum_types_server includes at least SHA256 or SHA1


STEP 4: TEST WITH EXPLICIT JDBC PROPERTIES (OPTIONAL DIAGNOSTIC)

Since you already tried using the ojdbc driver from Maven directly, you can further diagnose by setting explicit NNE properties on the JDBC connection. In a notebook on your cluster, try:

jdbc_url = "jdbc:oracle:thin:@//YOUR_HOST:1521/YOUR_SERVICE_NAME"

connection_properties = {
    "user": "YOUR_USER",
    "password": "YOUR_PASSWORD",
    "oracle.net.encryption_client": "REQUESTED",
    "oracle.net.encryption_types_client": "(AES256)",
    "oracle.net.crypto_checksum_client": "REQUESTED",
    "oracle.net.crypto_checksum_types_client": "(SHA256)",
    "oracle.jdbc.timezoneAsRegion": "false"
}

df = spark.read.format("jdbc") \
    .option("url", jdbc_url) \
    .option("dbtable", "(SELECT 1 FROM DUAL)") \
    .options(**connection_properties) \
    .load()

df.show()

If this works, the issue is confirmed as an algorithm mismatch. If it still fails, try changing the encryption_types_client and crypto_checksum_types_client values to match what your server supports.


STEP 5: CHECK FOR ORACLE RDS-SPECIFIC CONSIDERATIONS

Amazon RDS for Oracle has specific behavior around NNE:

1. NNE configuration is done through RDS option groups (not sqlnet.ora directly). Make sure the NATIVE_NETWORK_ENCRYPTION option is added to your RDS option group with compatible algorithms.

2. In the RDS option group, look for these settings under NATIVE_NETWORK_ENCRYPTION:
- SQLNET.ENCRYPTION_SERVER
- SQLNET.ENCRYPTION_TYPES_SERVER
- SQLNET.CRYPTO_CHECKSUM_SERVER
- SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER

3. RDS documentation for NNE configuration:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.Oracle.Options.NetworkEncryption.htm...
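If you want to inspect the option group programmatically, the JSON returned by `aws rds describe-option-groups` can be filtered down to the NNE settings. The sample document below is a trimmed, hypothetical shape of that output; verify the field names against what your CLI actually returns:

```python
import json

# Hypothetical, trimmed describe-option-groups output; the real JSON
# comes from: aws rds describe-option-groups --option-group-name <name>
sample = json.loads("""
{
  "OptionGroupsList": [{
    "Options": [{
      "OptionName": "NATIVE_NETWORK_ENCRYPTION",
      "OptionSettings": [
        {"Name": "SQLNET.ENCRYPTION_SERVER", "Value": "REQUESTED"},
        {"Name": "SQLNET.ENCRYPTION_TYPES_SERVER", "Value": "RC4_256,AES256"}
      ]
    }]
  }]
}
""")

def nne_settings(doc):
    """Collect NATIVE_NETWORK_ENCRYPTION settings into a flat dict."""
    for group in doc.get("OptionGroupsList", []):
        for opt in group.get("Options", []):
            if opt.get("OptionName") == "NATIVE_NETWORK_ENCRYPTION":
                return {s["Name"]: s["Value"] for s in opt["OptionSettings"]}
    return {}

settings = nne_settings(sample)
print(settings["SQLNET.ENCRYPTION_TYPES_SERVER"])  # → RC4_256,AES256
```

From there you can compare SQLNET.ENCRYPTION_TYPES_SERVER and SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER against the driver-supported lists in Step 2.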


STEP 6: ENABLE JDBC DRIVER TRACING (ADDITIONAL DIAGNOSTICS)

For more detail on what is happening during the connection attempt, you can enable Oracle JDBC thin driver tracing on your Databricks cluster. Add these Spark config properties to your cluster configuration:

spark.driver.extraJavaOptions -Doracle.jdbc.Trace=true -Djava.util.logging.config.file=/tmp/ojdbc_logging.properties

Create the logging properties file in an init script:

cat > /tmp/ojdbc_logging.properties << 'EOF'
.level=FINE
handlers=java.util.logging.FileHandler
java.util.logging.FileHandler.pattern=/tmp/ojdbc_trace.log
java.util.logging.FileHandler.limit=50000000
java.util.logging.FileHandler.count=1
java.util.logging.FileHandler.formatter=java.util.logging.SimpleFormatter
EOF

After the failed connection attempt, check /tmp/ojdbc_trace.log on the driver for detailed NNE negotiation messages.
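The trace log can be large, so a quick filter helps. Here is a small sketch that keeps only lines mentioning negotiation-related keywords; the path matches the FileHandler pattern configured above, and the keyword list is an assumption about what the FINE-level trace contains:

```python
import re

# Keywords assumed to appear in the NNE negotiation portion of the trace.
KEYWORDS = re.compile(r"encryption|checksum|negotiat", re.IGNORECASE)

def negotiation_lines(path="/tmp/ojdbc_trace.log"):
    """Return trace lines that mention encryption/checksum negotiation."""
    with open(path) as f:
        return [line.rstrip() for line in f if KEYWORDS.search(line)]
```

Run `negotiation_lines()` in a notebook cell on the driver after a failed connection attempt and inspect the last few entries.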


ADDITIONAL NOTES

- The Databricks documentation confirms that Lakehouse Federation uses NNE (not TLS) for non-Oracle Cloud connections: https://docs.databricks.com/aws/en/query-federation/oracle

- Make sure your Oracle RDS security group allows inbound traffic on port 1521 from the Databricks stable egress IPs (which you mentioned you have already done).

- If you are using a SQL warehouse (serverless), make sure you have configured a Network Connectivity Configuration (NCC) and that the stable NAT IPs are whitelisted on the RDS side: https://docs.databricks.com/aws/en/security/network/serverless-network-security/serverless-firewall....

- The "Socket read timed out" variant you saw with the Maven ojdbc driver points to the same root cause: the driver is waiting for the server to respond to its authentication/encryption handshake, but the server has already closed the socket due to algorithm negotiation failure.

If you have tried all the above and are still seeing the issue, I would recommend opening a Databricks support ticket. The support team can inspect the serverless/cluster-side connection logs at a deeper level and help identify the exact negotiation failure point.

* This reply was drafted with an agent system I built, which researches and drafts responses based on the wide set of documentation I have available and previous memory. I personally review each draft for obvious issues and to monitor system reliability, and I update it when I detect any drift, but there is still a small chance something is inaccurate, especially if you are experimenting with brand-new features.


5 REPLIES

Saritha_S
Databricks Employee

Hi @luketl2 Could you please provide us the complete stack trace? 

A few notes: we know that the IP of the cluster I am testing from is whitelisted, and running nc -vz against their host and port returns "open," so I am confident the TCP connection is established. There is nothing in stderr on the driver, but here is what I see in log4j:

26/02/03 15:19:41 INFO ClusterLoadMonitor: Removed query with execution ID:9. Current active queries:0
26/02/03 15:19:41 INFO SQLExecution:  0 QueryExecution(s) are running
26/02/03 15:19:41 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-19c23-fe347-5,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to previous category Some(UNDETERMINED).
26/02/03 15:19:41 ERROR SQLDriverLocal: Error in SQL query: -- This is a system generated query from catalog explorer
CREATE FOREIGN CATALOG `foreign_data` USING CONNECTION `foreign_db` OPTIONS (service_name 'db_a', test_catalog 'true')
org.apache.spark.SparkIllegalArgumentException: [CANNOT_ESTABLISH_CONNECTION] Cannot establish connection to remote ORACLE database. Please check connection information and credentials e.g. host, port, user, password and database options. ** If you believe the information is correct, please check your workspace's network setup and ensure it does not have outbound restrictions to the host. Please also check that the host does not block inbound connections from the network where the workspace's Spark clusters are deployed. ** Detailed error message: ORA-17002: I/O error: Socket read interrupted, Authentication lapse 0 ms.
at org.apache.spark.sql.errors.QueryExecutionErrors$.cannotEstablishConnectionError(QueryExecutionErrors.scala:1278)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCDialectTestConnection.testCatalogConnection(JDBCTestConnection.scala:86)
at com.databricks.sql.managedcatalog.command.QueryFederationCommand$.testCatalogConnection(queryFederationCommandsExec.scala:153)
at com.databricks.sql.managedcatalog.command.TestForeignCatalogConnectionCommand.run(queryFederationCommandsExec.scala:302)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$2(commands.scala:84)
at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:194)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:84)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:81)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:80)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:94)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$5(QueryExecution.scala:553)
at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$4(QueryExecution.scala:553)
at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:233)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$3(QueryExecution.scala:552)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$11(SQLExecution.scala:486)
at com.databricks.sql.util.MemoryTrackerHelper.withMemoryTracking(MemoryTrackerHelper.scala:80)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$10(SQLExecution.scala:422)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:810)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:352)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1481)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:217)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:747)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$2(QueryExecution.scala:548)
at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1426)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:544)
at org.apache.spark.sql.execution.QueryExecution.withMVTagsIfNecessary(QueryExecution.scala:473)
at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$eagerlyExecute$1(QueryExecution.scala:542)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:621)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:616)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:521)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:85)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:521)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:42)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:361)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:357)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:42)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:42)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:497)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$8(QueryExecution.scala:616)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:418)
at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:616)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyCommandExecuted$1(QueryExecution.scala:429)
at scala.util.Try$.apply(Try.scala:213)
at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1684)
at org.apache.spark.util.Utils$.getTryWithCallerStacktrace(Utils.scala:1745)
at org.apache.spark.util.LazyTry.get(LazyTry.scala:58)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:434)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:406)
at org.apache.spark.sql.Dataset$.$anonfun$ofRows$3(Dataset.scala:150)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1481)
at org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1488)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1488)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:141)
at org.apache.spark.sql.SparkSession.$anonfun$sql$4(SparkSession.scala:1161)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1481)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:1113)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:1185)
at com.databricks.backend.daemon.driver.DriverLocal$DbClassicStrategy.executeSQLQuery(DriverLocal.scala:372)
at com.databricks.backend.daemon.driver.DriverLocal.executeSQLSubCommand(DriverLocal.scala:472)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$executeSql$1(DriverLocal.scala:496)
at scala.collection.immutable.List.map(List.scala:293)
at com.databricks.backend.daemon.driver.DriverLocal.executeSql(DriverLocal.scala:491)
at com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:39)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$36(DriverLocal.scala:1321)
at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:51)
at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:104)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$30(DriverLocal.scala:1312)
at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:49)
at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:293)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:289)
at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:47)
at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:44)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:130)
at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:96)
at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:77)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:130)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$1(DriverLocal.scala:1236)
at com.databricks.backend.daemon.driver.DriverLocal$.$anonfun$maybeSynchronizeExecution$4(DriverLocal.scala:1721)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:879)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$2(DriverWrapper.scala:1054)
at scala.util.Try$.apply(Try.scala:213)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:1043)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$3(DriverWrapper.scala:1089)
at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:616)
at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:643)
at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:49)
at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:293)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:289)
at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:47)
at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:44)
at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:81)
at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:96)
at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:77)
at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionTags(DriverWrapper.scala:81)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:611)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:519)
at com.databricks.backend.daemon.driver.DriverWrapper.recordOperationWithResultTags(DriverWrapper.scala:81)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:1089)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:766)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:859)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$runInnerLoop$1(DriverWrapper.scala:630)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:49)
at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:293)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:289)
at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:47)
at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:44)
at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:81)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:625)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:548)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:373)
at java.base/java.lang.Thread.run(Thread.java:840)
Suppressed: org.apache.spark.util.Utils$OriginalTryStackTraceException: Full stacktrace of original doTryWithCallerStacktrace caller
at org.apache.spark.sql.errors.QueryExecutionErrors$.cannotEstablishConnectionError(QueryExecutionErrors.scala:1278)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCDialectTestConnection.testCatalogConnection(JDBCTestConnection.scala:86)
at com.databricks.sql.managedcatalog.command.QueryFederationCommand$.testCatalogConnection(queryFederationCommandsExec.scala:153)
at com.databricks.sql.managedcatalog.command.TestForeignCatalogConnectionCommand.run(queryFederationCommandsExec.scala:302)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$2(commands.scala:84)
at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:194)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:84)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:81)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:80)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:94)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$5(QueryExecution.scala:553)
at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$4(QueryExecution.scala:553)
at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:233)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$3(QueryExecution.scala:552)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$11(SQLExecution.scala:486)
at com.databricks.sql.util.MemoryTrackerHelper.withMemoryTracking(MemoryTrackerHelper.scala:80)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$10(SQLExecution.scala:422)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:810)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:352)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1481)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:217)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:747)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$2(QueryExecution.scala:548)
at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1426)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:544)
at org.apache.spark.sql.execution.QueryExecution.withMVTagsIfNecessary(QueryExecution.scala:473)
at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$eagerlyExecute$1(QueryExecution.scala:542)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:621)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:616)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:521)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:85)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:521)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:42)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:361)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:357)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:42)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:42)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:497)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$8(QueryExecution.scala:616)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:418)
at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:616)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyCommandExecuted$1(QueryExecution.scala:429)
at scala.util.Try$.apply(Try.scala:213)
at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1684)
at org.apache.spark.util.LazyTry.tryT$lzycompute(LazyTry.scala:46)
at org.apache.spark.util.LazyTry.tryT(LazyTry.scala:46)
... 71 more
Caused by: java.sql.SQLRecoverableException: ORA-17002: I/O error: Socket read interrupted, Authentication lapse 0 ms.
at shaded.databricks.oracle.jdbc.driver.T4CConnection.handleLogonIOException(T4CConnection.java:1644)
at shaded.databricks.oracle.jdbc.driver.T4CConnection.handleLogonInterruptedIOException(T4CConnection.java:1505)
at shaded.databricks.oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:1118)
at shaded.databricks.oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:1175)
at shaded.databricks.oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:105)
at shaded.databricks.oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:886)
at shaded.databricks.oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:693)
at org.apache.spark.sql.execution.datasources.jdbc.connection.BasicConnectionProvider.getConnection(BasicConnectionProvider.scala:50)
at org.apache.spark.sql.execution.datasources.jdbc.connection.ConnectionProviderBase.create(ConnectionProvider.scala:102)
at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1(JdbcDialects.scala:270)
at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1$adapted(JdbcDialects.scala:266)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.withConnection(JdbcUtils.scala:1373)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCDialectTestConnection.testCatalogConnection(JDBCTestConnection.scala:83)
at com.databricks.sql.managedcatalog.command.QueryFederationCommand$.testCatalogConnection(queryFederationCommandsExec.scala:153)
at com.databricks.sql.managedcatalog.command.TestForeignCatalogConnectionCommand.run(queryFederationCommandsExec.scala:302)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$2(commands.scala:84)
at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:194)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:84)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:81)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:80)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:94)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$5(QueryExecution.scala:553)
at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$4(QueryExecution.scala:553)
at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:233)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$3(QueryExecution.scala:552)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$11(SQLExecution.scala:486)
at com.databricks.sql.util.MemoryTrackerHelper.withMemoryTracking(MemoryTrackerHelper.scala:80)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$10(SQLExecution.scala:422)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:810)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:352)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1481)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:217)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:747)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$2(QueryExecution.scala:548)
at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1426)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:544)
at org.apache.spark.sql.execution.QueryExecution.withMVTagsIfNecessary(QueryExecution.scala:473)
at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$eagerlyExecute$1(QueryExecution.scala:542)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:621)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:616)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:521)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:85)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:521)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:42)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:361)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:357)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:42)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:42)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:497)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$8(QueryExecution.scala:616)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:418)
at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:616)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyCommandExecuted$1(QueryExecution.scala:429)
at scala.util.Try$.apply(Try.scala:213)
at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1684)
at org.apache.spark.util.LazyTry.tryT$lzycompute(LazyTry.scala:46)
at org.apache.spark.util.LazyTry.tryT(LazyTry.scala:46)
... 71 more
Caused by: java.io.IOException: Socket read interrupted, Authentication lapse 0 ms.
at shaded.databricks.oracle.jdbc.driver.T4CConnection.handleLogonIOException(T4CConnection.java:1639)
... 129 more
Caused by: java.io.InterruptedIOException: Socket read interrupted
at shaded.databricks.oracle.net.nt.TimeoutSocketChannel.doBlockedRead(TimeoutSocketChannel.java:612)
at shaded.databricks.oracle.net.nt.TimeoutSocketChannel.read(TimeoutSocketChannel.java:536)
at shaded.databricks.oracle.net.ns.NSProtocolNIO.doSocketRead(NSProtocolNIO.java:1224)
at shaded.databricks.oracle.net.ns.NIOPacket.readNIOPacket(NIOPacket.java:436)
at shaded.databricks.oracle.net.ns.NSProtocolNIO.negotiateConnection(NSProtocolNIO.java:216)
at shaded.databricks.oracle.net.ns.NSProtocol.connect(NSProtocol.java:352)
at shaded.databricks.oracle.jdbc.driver.T4CConnection.connectNetworkSessionProtocol(T4CConnection.java:3180)
at shaded.databricks.oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:1002)
... 127 more
26/02/03 15:19:41 INFO ProgressReporter$: Removed result fetcher for 1770130388085_8294809462390751836_c30cf945-7a17-445a-be87-4f94a23a64f9
26/02/03 15:19:41 INFO QueryProfileListener: Query profile sent to logger, seq number: 9, app id: app-20260203145333-0000

I also tried connecting with the ojdbc driver from Maven instead of federation, and got this stack trace in the log4j output:

Py4JJavaError: An error occurred while calling o503.load. : java.sql.SQLRecoverableException: IO Error: Socket read timed out, Authentication lapse 0 ms. at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:874) at oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:793) at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:57) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:747) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:562) at org.apache.spark.sql.execution.datasources.jdbc.connection.BasicConnectionProvider.getConnection(BasicConnectionProvider.scala:50) at org.apache.spark.sql.execution.datasources.jdbc.connection.ConnectionProviderBase.create(ConnectionProvider.scala:102) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$2(JdbcDialects.scala:269) at org.apache.spark.sql.catalyst.MetricKeyUtils$.measure(MetricKey.scala:2259) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1(JdbcDialects.scala:269) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1$adapted(JdbcDialects.scala:265) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.withConnection(JdbcUtils.scala:1383) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:90) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:256) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.$anonfun$createRelation$1(JdbcRelationProvider.scala:50) at org.apache.spark.sql.execution.metric.SQLMetrics$.withTimingNs(SQLMetrics.scala:493) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:50) at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:455) at 
org.apache.spark.sql.catalyst.analysis.ResolveDataSource.org$apache$spark$sql$catalyst$analysis$ResolveDataSource$$loadV1BatchSource(ResolveDataSource.scala:277) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.$anonfun$applyOrElse$5(ResolveDataSource.scala:114) at scala.Option.getOrElse(Option.scala:201) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.applyOrElse(ResolveDataSource.scala:113) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.applyOrElse(ResolveDataSource.scala:60) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$3(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:142) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$1(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:418) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning(AnalysisHelper.scala:137) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning$(AnalysisHelper.scala:133) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUpWithPruning(LogicalPlan.scala:47) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp(AnalysisHelper.scala:114) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp$(AnalysisHelper.scala:113) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:47) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.apply(ResolveDataSource.scala:60) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.apply(ResolveDataSource.scala:58) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$17(RuleExecutor.scala:509) at 
org.apache.spark.sql.catalyst.rules.RecoverableRuleExecutionHelper.processRule(RuleExecutor.scala:663) at org.apache.spark.sql.catalyst.rules.RecoverableRuleExecutionHelper.processRule$(RuleExecutor.scala:647) at org.apache.spark.sql.catalyst.rules.RuleExecutor.processRule(RuleExecutor.scala:143) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$16(RuleExecutor.scala:509) at com.databricks.spark.util.MemoryTracker$.withThreadAllocatedBytes(MemoryTracker.scala:51) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.measureRule(QueryPlanningTracker.scala:382) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$15(RuleExecutor.scala:507) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$14(RuleExecutor.scala:506) at scala.collection.LinearSeqOps.foldLeft(LinearSeq.scala:183) at scala.collection.LinearSeqOps.foldLeft$(LinearSeq.scala:179) at scala.collection.immutable.List.foldLeft(List.scala:79) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$13(RuleExecutor.scala:498) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeBatch$1(RuleExecutor.scala:472) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$23(RuleExecutor.scala:619) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$23$adapted(RuleExecutor.scala:619) at 
scala.collection.immutable.List.foreach(List.scala:334) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:619) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:365) at org.apache.spark.sql.catalyst.analysis.Analyzer.super$execute(Analyzer.scala:734) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeSameContext$1(Analyzer.scala:734) at com.databricks.sql.unity.SAMSnapshotHelper$.visitPlansDuringAnalysis(SAMSnapshotHelper.scala:40) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeSameContext(Analyzer.scala:733) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$execute$1(Analyzer.scala:706) at org.apache.spark.sql.catalyst.analysis.AnalysisContext$.withNewAnalysisContext(Analyzer.scala:477) at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:706) at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:629) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:353) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:265) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:353) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.resolveInFixedPoint(HybridAnalyzer.scala:438) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.$anonfun$apply$1(HybridAnalyzer.scala:97) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.withTrackedAnalyzerBridgeState(HybridAnalyzer.scala:134) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.apply(HybridAnalyzer.scala:90) at 
org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$2(Analyzer.scala:686) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:425) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:686) at com.databricks.sql.unity.SAMSnapshotHelper$.visitPlansDuringAnalysis(SAMSnapshotHelper.scala:40) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:678) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$3(QueryExecution.scala:487) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:812) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$8(QueryExecution.scala:1001) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withExecutionPhase$1(SQLExecution.scala:167) at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:348) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:59) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:344) at com.databricks.util.TracingSpanUtils$.$anonfun$withTracing$4(TracingSpanUtils.scala:246) at com.databricks.util.TracingSpanUtils$.withTracing(TracingSpanUtils.scala:130) at com.databricks.util.TracingSpanUtils$.withTracing(TracingSpanUtils.scala:244) at com.databricks.spark.util.DatabricksTracingHelper.withSpan(DatabricksSparkTracingHelper.scala:154) at com.databricks.spark.util.DBRTracing$.withSpan(DBRTracing.scala:87) at org.apache.spark.sql.execution.SQLExecution$.withExecutionPhase(SQLExecution.scala:148) at 
org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$7(QueryExecution.scala:1001) at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1660) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$5(QueryExecution.scala:994) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$4(QueryExecution.scala:991) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$3(QueryExecution.scala:991) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:990) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.withQueryExecutionId(QueryExecution.scala:978) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:989) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:860) at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:988) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$2(QueryExecution.scala:478) at com.databricks.sql.util.MemoryTrackerHelper.withMemoryTracking(MemoryTrackerHelper.scala:111) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$1(QueryExecution.scala:477) at scala.util.Try$.apply(Try.scala:217) at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1685) at org.apache.spark.util.Utils$.getTryWithCallerStacktrace(Utils.scala:1746) at org.apache.spark.util.LazyTry.get(LazyTry.scala:75) at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:516) at 
org.mlflow.spark.autologging.DatasourceAttributeExtractorBase.getTableInfos(DatasourceAttributeExtractor.scala:88) at org.mlflow.spark.autologging.DatasourceAttributeExtractorBase.getTableInfos$(DatasourceAttributeExtractor.scala:85) at org.mlflow.spark.autologging.ReplAwareDatasourceAttributeExtractor$.getTableInfos(DatasourceAttributeExtractor.scala:144) at org.mlflow.spark.autologging.ReplAwareSparkDataSourceListener.onSQLExecutionEnd(ReplAwareSparkDataSourceListener.scala:49) at org.mlflow.spark.autologging.SparkDataSourceListener.$anonfun$onOtherEvent$1(SparkDataSourceListener.scala:39) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) at org.mlflow.spark.autologging.ExceptionUtils$.tryAndLogUnexpectedError(ExceptionUtils.scala:26) at org.mlflow.spark.autologging.SparkDataSourceListener.onOtherEvent(SparkDataSourceListener.scala:39) at org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:108) at org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28) at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:46) at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:46) at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:208) at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:172) at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:183) at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:183) at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:59) at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:119) at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:115) at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1573) at 
org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:115) Suppressed: org.apache.spark.util.Utils$OriginalTryStackTraceException: Full stacktrace of original doTryWithCallerStacktrace caller at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:874) at oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:793) at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:57) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:747) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:562) at org.apache.spark.sql.execution.datasources.jdbc.connection.BasicConnectionProvider.getConnection(BasicConnectionProvider.scala:50) at org.apache.spark.sql.execution.datasources.jdbc.connection.ConnectionProviderBase.create(ConnectionProvider.scala:102) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$2(JdbcDialects.scala:269) at org.apache.spark.sql.catalyst.MetricKeyUtils$.measure(MetricKey.scala:2259) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1(JdbcDialects.scala:269) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1$adapted(JdbcDialects.scala:265) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.withConnection(JdbcUtils.scala:1383) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:90) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:256) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.$anonfun$createRelation$1(JdbcRelationProvider.scala:50) at org.apache.spark.sql.execution.metric.SQLMetrics$.withTimingNs(SQLMetrics.scala:493) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:50) at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:455) at 
org.apache.spark.sql.catalyst.analysis.ResolveDataSource.org$apache$spark$sql$catalyst$analysis$ResolveDataSource$$loadV1BatchSource(ResolveDataSource.scala:277) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.$anonfun$applyOrElse$5(ResolveDataSource.scala:114) at scala.Option.getOrElse(Option.scala:201) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.applyOrElse(ResolveDataSource.scala:113) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.applyOrElse(ResolveDataSource.scala:60) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$3(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:142) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$1(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:418) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning(AnalysisHelper.scala:137) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning$(AnalysisHelper.scala:133) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUpWithPruning(LogicalPlan.scala:47) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp(AnalysisHelper.scala:114) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp$(AnalysisHelper.scala:113) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:47) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.apply(ResolveDataSource.scala:60) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.apply(ResolveDataSource.scala:58) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$17(RuleExecutor.scala:509) at 
org.apache.spark.sql.catalyst.rules.RecoverableRuleExecutionHelper.processRule(RuleExecutor.scala:663) at org.apache.spark.sql.catalyst.rules.RecoverableRuleExecutionHelper.processRule$(RuleExecutor.scala:647) at org.apache.spark.sql.catalyst.rules.RuleExecutor.processRule(RuleExecutor.scala:143) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$16(RuleExecutor.scala:509) at com.databricks.spark.util.MemoryTracker$.withThreadAllocatedBytes(MemoryTracker.scala:51) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.measureRule(QueryPlanningTracker.scala:382) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$15(RuleExecutor.scala:507) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$14(RuleExecutor.scala:506) at scala.collection.LinearSeqOps.foldLeft(LinearSeq.scala:183) at scala.collection.LinearSeqOps.foldLeft$(LinearSeq.scala:179) at scala.collection.immutable.List.foldLeft(List.scala:79) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$13(RuleExecutor.scala:498) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeBatch$1(RuleExecutor.scala:472) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$23(RuleExecutor.scala:619) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$23$adapted(RuleExecutor.scala:619) at 
scala.collection.immutable.List.foreach(List.scala:334) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:619) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:365) at org.apache.spark.sql.catalyst.analysis.Analyzer.super$execute(Analyzer.scala:734) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeSameContext$1(Analyzer.scala:734) at com.databricks.sql.unity.SAMSnapshotHelper$.visitPlansDuringAnalysis(SAMSnapshotHelper.scala:40) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeSameContext(Analyzer.scala:733) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$execute$1(Analyzer.scala:706) at org.apache.spark.sql.catalyst.analysis.AnalysisContext$.withNewAnalysisContext(Analyzer.scala:477) at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:706) at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:629) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:353) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:265) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:353) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.resolveInFixedPoint(HybridAnalyzer.scala:438) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.$anonfun$apply$1(HybridAnalyzer.scala:97) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.withTrackedAnalyzerBridgeState(HybridAnalyzer.scala:134) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.apply(HybridAnalyzer.scala:90) at 
org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$2(Analyzer.scala:686) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:425) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:686) at com.databricks.sql.unity.SAMSnapshotHelper$.visitPlansDuringAnalysis(SAMSnapshotHelper.scala:40) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:678) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$3(QueryExecution.scala:487) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:812) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$8(QueryExecution.scala:1001) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withExecutionPhase$1(SQLExecution.scala:167) at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:348) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:59) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:344) at com.databricks.util.TracingSpanUtils$.$anonfun$withTracing$4(TracingSpanUtils.scala:246) at com.databricks.util.TracingSpanUtils$.withTracing(TracingSpanUtils.scala:130) at com.databricks.util.TracingSpanUtils$.withTracing(TracingSpanUtils.scala:244) at com.databricks.spark.util.DatabricksTracingHelper.withSpan(DatabricksSparkTracingHelper.scala:154) at com.databricks.spark.util.DBRTracing$.withSpan(DBRTracing.scala:87) at org.apache.spark.sql.execution.SQLExecution$.withExecutionPhase(SQLExecution.scala:148) at 
org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$7(QueryExecution.scala:1001) at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1660) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$5(QueryExecution.scala:994) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$4(QueryExecution.scala:991) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$3(QueryExecution.scala:991) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:990) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.withQueryExecutionId(QueryExecution.scala:978) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:989) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:860) at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:988) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$2(QueryExecution.scala:478) at com.databricks.sql.util.MemoryTrackerHelper.withMemoryTracking(MemoryTrackerHelper.scala:111) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$1(QueryExecution.scala:477) at scala.util.Try$.apply(Try.scala:217) at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1685) at org.apache.spark.util.LazyTry.tryT$lzycompute(LazyTry.scala:60) at org.apache.spark.util.LazyTry.tryT(LazyTry.scala:59) at org.apache.spark.util.LazyTry.get(LazyTry.scala:75) at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:516) at 
org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:404) at org.apache.spark.sql.classic.Dataset$.$anonfun$ofRows$1(Dataset.scala:128) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:860) at org.apache.spark.sql.classic.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1245) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.classic.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1245) at org.apache.spark.sql.classic.Dataset$.ofRows(Dataset.scala:126) at org.apache.spark.sql.classic.DataFrameReader.load(DataFrameReader.scala:164) at org.apache.spark.sql.classic.DataFrameReader.load(DataFrameReader.scala:140) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:75) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:52) at java.base/java.lang.reflect.Method.invoke(Method.java:580) at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:397) at py4j.Gateway.invoke(Gateway.java:306) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:197) at py4j.ClientServerConnection.run(ClientServerConnection.java:117) at java.base/java.lang.Thread.run(Thread.java:1583) Caused by: java.io.IOException: Socket read timed out, Authentication lapse 0 ms. 
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:870) at oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:793) at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:57) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:747) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:562) at org.apache.spark.sql.execution.datasources.jdbc.connection.BasicConnectionProvider.getConnection(BasicConnectionProvider.scala:50) at org.apache.spark.sql.execution.datasources.jdbc.connection.ConnectionProviderBase.create(ConnectionProvider.scala:102) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$2(JdbcDialects.scala:269) at org.apache.spark.sql.catalyst.MetricKeyUtils$.measure(MetricKey.scala:2259) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1(JdbcDialects.scala:269) at org.apache.spark.sql.jdbc.JdbcDialect.$anonfun$createConnectionFactory$1$adapted(JdbcDialects.scala:265) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.withConnection(JdbcUtils.scala:1383) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:90) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:256) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.$anonfun$createRelation$1(JdbcRelationProvider.scala:50) at org.apache.spark.sql.execution.metric.SQLMetrics$.withTimingNs(SQLMetrics.scala:493) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:50) at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:455) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.org$apache$spark$sql$catalyst$analysis$ResolveDataSource$$loadV1BatchSource(ResolveDataSource.scala:277) at 
org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.$anonfun$applyOrElse$5(ResolveDataSource.scala:114) at scala.Option.getOrElse(Option.scala:201) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.applyOrElse(ResolveDataSource.scala:113) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource$$anonfun$apply$1.applyOrElse(ResolveDataSource.scala:60) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$3(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:142) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$1(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:418) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning(AnalysisHelper.scala:137) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning$(AnalysisHelper.scala:133) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUpWithPruning(LogicalPlan.scala:47) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp(AnalysisHelper.scala:114) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp$(AnalysisHelper.scala:113) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:47) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.apply(ResolveDataSource.scala:60) at org.apache.spark.sql.catalyst.analysis.ResolveDataSource.apply(ResolveDataSource.scala:58) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$17(RuleExecutor.scala:509) at org.apache.spark.sql.catalyst.rules.RecoverableRuleExecutionHelper.processRule(RuleExecutor.scala:663) at 
org.apache.spark.sql.catalyst.rules.RecoverableRuleExecutionHelper.processRule$(RuleExecutor.scala:647) at org.apache.spark.sql.catalyst.rules.RuleExecutor.processRule(RuleExecutor.scala:143) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$16(RuleExecutor.scala:509) at com.databricks.spark.util.MemoryTracker$.withThreadAllocatedBytes(MemoryTracker.scala:51) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.measureRule(QueryPlanningTracker.scala:382) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$15(RuleExecutor.scala:507) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$14(RuleExecutor.scala:506) at scala.collection.LinearSeqOps.foldLeft(LinearSeq.scala:183) at scala.collection.LinearSeqOps.foldLeft$(LinearSeq.scala:179) at scala.collection.immutable.List.foldLeft(List.scala:79) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$13(RuleExecutor.scala:498) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeBatch$1(RuleExecutor.scala:472) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$23(RuleExecutor.scala:619) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$23$adapted(RuleExecutor.scala:619) at scala.collection.immutable.List.foreach(List.scala:334) at 
org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:619) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:365) at org.apache.spark.sql.catalyst.analysis.Analyzer.super$execute(Analyzer.scala:734) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeSameContext$1(Analyzer.scala:734) at com.databricks.sql.unity.SAMSnapshotHelper$.visitPlansDuringAnalysis(SAMSnapshotHelper.scala:40) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeSameContext(Analyzer.scala:733) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$execute$1(Analyzer.scala:706) at org.apache.spark.sql.catalyst.analysis.AnalysisContext$.withNewAnalysisContext(Analyzer.scala:477) at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:706) at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:629) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:353) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:265) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:353) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.resolveInFixedPoint(HybridAnalyzer.scala:438) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.$anonfun$apply$1(HybridAnalyzer.scala:97) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.withTrackedAnalyzerBridgeState(HybridAnalyzer.scala:134) at org.apache.spark.sql.catalyst.analysis.resolver.HybridAnalyzer.apply(HybridAnalyzer.scala:90) at 
org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$2(Analyzer.scala:686) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:425) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:686) at com.databricks.sql.unity.SAMSnapshotHelper$.visitPlansDuringAnalysis(SAMSnapshotHelper.scala:40) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:678) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$3(QueryExecution.scala:487) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:812) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$8(QueryExecution.scala:1001) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withExecutionPhase$1(SQLExecution.scala:167) at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:348) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:59) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:344) at com.databricks.util.TracingSpanUtils$.$anonfun$withTracing$4(TracingSpanUtils.scala:246) at com.databricks.util.TracingSpanUtils$.withTracing(TracingSpanUtils.scala:130) at com.databricks.util.TracingSpanUtils$.withTracing(TracingSpanUtils.scala:244) at com.databricks.spark.util.DatabricksTracingHelper.withSpan(DatabricksSparkTracingHelper.scala:154) at com.databricks.spark.util.DBRTracing$.withSpan(DBRTracing.scala:87) at org.apache.spark.sql.execution.SQLExecution$.withExecutionPhase(SQLExecution.scala:148) at 
org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$7(QueryExecution.scala:1001) at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1660) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$5(QueryExecution.scala:994) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$4(QueryExecution.scala:991) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$3(QueryExecution.scala:991) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:990) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.withQueryExecutionId(QueryExecution.scala:978) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:989) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:860) at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:988) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$2(QueryExecution.scala:478) at com.databricks.sql.util.MemoryTrackerHelper.withMemoryTracking(MemoryTrackerHelper.scala:111) at org.apache.spark.sql.execution.QueryExecution.$anonfun$lazyAnalyzed$1(QueryExecution.scala:477) at scala.util.Try$.apply(Try.scala:217) at org.apache.spark.util.Utils$.doTryWithCallerStacktrace(Utils.scala:1685) at org.apache.spark.util.LazyTry.tryT$lzycompute(LazyTry.scala:60) at org.apache.spark.util.LazyTry.tryT(LazyTry.scala:59) at org.apache.spark.util.LazyTry.get(LazyTry.scala:75) at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:516) at 
org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:404) at org.apache.spark.sql.classic.Dataset$.$anonfun$ofRows$1(Dataset.scala:128) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:860) at org.apache.spark.sql.classic.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1245) at com.databricks.spark.util.FrameProfiler$.$anonfun$record$1(FrameProfiler.scala:114) at com.databricks.spark.util.FrameProfilerExporter$.maybeExportFrameProfiler(FrameProfilerExporter.scala:200) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:105) at org.apache.spark.sql.classic.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1245) at org.apache.spark.sql.classic.Dataset$.ofRows(Dataset.scala:126) at org.apache.spark.sql.classic.DataFrameReader.load(DataFrameReader.scala:164) at org.apache.spark.sql.classic.DataFrameReader.load(DataFrameReader.scala:140) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:75) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:52) at java.base/java.lang.reflect.Method.invoke(Method.java:580) at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:397) at py4j.Gateway.invoke(Gateway.java:306) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:197) at py4j.ClientServerConnection.run(ClientServerConnection.java:117) at java.base/java.lang.Thread.run(Thread.java:1583) Caused by: oracle.net.nt.TimeoutInterruptHandler$IOReadTimeoutException: Socket read timed out at oracle.net.nt.TimeoutSocketChannel.handleInterrupt(TimeoutSocketChannel.java:254) at 
oracle.net.nt.TimeoutSocketChannel.read(TimeoutSocketChannel.java:180) at oracle.net.ns.NSProtocolNIO.doSocketRead(NSProtocolNIO.java:555) at oracle.net.ns.NIOPacket.readNIOPacket(NIOPacket.java:403) at oracle.net.ns.NSProtocolNIO.negotiateConnection(NSProtocolNIO.java:127) at oracle.net.ns.NSProtocol.connect(NSProtocol.java:340) at oracle.jdbc.driver.T4CConnection.connect(T4CConnection.java:1596) at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:588) ... 145 more

SteveOstrowski
Databricks Employee

Hi @luketl2,

Based on your description and the stack trace you shared, the root cause is almost certainly an NNE (Native Network Encryption) algorithm mismatch between the Oracle JDBC thin driver that Databricks uses and your Oracle RDS server-side configuration. The key clue is this line in your stack trace:

Authentication lapse 0 ms

This means the TCP handshake completed successfully (which you already confirmed with nc -vz) and the JDBC driver initiated the Oracle Net session, but encryption/integrity algorithm negotiation failed during authentication. Oracle then drops the socket before the authentication exchange finishes.
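The failure pattern can be reproduced in miniature with a plain socket (a local stand-in, not Oracle): the TCP connect succeeds, but the read blocks until it times out because the server never answers the handshake:

```python
import socket
import threading
import time

# Local stand-in for the symptom: the server accepts the TCP connection
# (so checks like `nc -vz` pass) but never answers the protocol handshake,
# so the client's read times out before the session is established.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def silent_accept():
    conn, _ = server.accept()
    time.sleep(2)        # accept, then stay silent (no handshake reply)
    conn.close()

threading.Thread(target=silent_accept, daemon=True).start()

client = socket.create_connection(("127.0.0.1", port), timeout=1)  # TCP ok
client.settimeout(0.5)
try:
    client.recv(1024)    # waits for a reply that never comes
    outcome = "got data"
except socket.timeout:
    outcome = "read timed out"
print(outcome)  # read timed out
```

This is exactly the shape of the "Socket read timed out" / "socket closed" errors: networking checks pass, but the session dies during negotiation.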

Here is a systematic approach to diagnose and resolve this.


STEP 1: VERIFY THE SERVER-SIDE NNE ALGORITHM LIST

On your Oracle RDS instance, check which encryption and integrity (checksum) algorithms are configured. You can query this from a privileged session:

SELECT name, value
  FROM v$parameter
 WHERE name IN (
       'sqlnet.encryption_types_server',
       'sqlnet.encryption_server',
       'sqlnet.crypto_checksum_types_server',
       'sqlnet.crypto_checksum_server'
       );

If that view does not expose them (on RDS these are Oracle Net settings managed through the option group rather than init parameters), check the NATIVE_NETWORK_ENCRYPTION option in your RDS option group for:

sqlnet.encryption_server = REQUESTED (you confirmed this)
sqlnet.encryption_types_server = (list of algorithms)
sqlnet.crypto_checksum_server = REQUESTED or ACCEPTED
sqlnet.crypto_checksum_types_server = (list of algorithms)

The Databricks JDBC thin driver supports a specific set of NNE algorithms. If your server is configured with algorithms the thin driver does not support (for example, older algorithms like DES or 3DES112 only, or newer ones not yet in the driver), negotiation will fail silently and Oracle closes the socket.
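As a way to reason about this, here is a deliberately simplified model of the negotiation (not Oracle's actual state machine, just an illustration of why matching levels alone are not enough — the algorithm lists must also intersect):

```python
# Simplified (unofficial) model of Oracle NNE negotiation outcomes.
# Levels: REJECTED < ACCEPTED < REQUESTED < REQUIRED.
def nne_outcome(client_level, server_level, client_algs, server_algs):
    levels = {client_level, server_level}
    if "REJECTED" in levels:
        # One side forbids encryption; fails only if the other demands it.
        return "refused" if "REQUIRED" in levels else "plaintext"
    if levels == {"ACCEPTED"}:
        return "plaintext"  # neither side initiates negotiation
    # At least one side is REQUESTED/REQUIRED: negotiation happens,
    # and it only succeeds if the algorithm lists overlap.
    common = [a for a in client_algs if a in server_algs]
    return f"encrypted:{common[0]}" if common else "refused"

# Levels are compatible, but the algorithm lists do not intersect:
print(nne_outcome("REQUESTED", "REQUESTED", ["3DES168"], ["AES256"]))  # refused
```

In the "refused" case the server typically just closes the socket, which surfaces on the client as exactly the error you are seeing.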


STEP 2: ALIGN ENCRYPTION ALGORITHMS

The Oracle JDBC thin driver typically supports these NNE encryption algorithms:

AES256
AES192
AES128
3DES168

And these integrity/checksum algorithms:

SHA256
SHA384
SHA512
SHA1

Make sure your RDS option group includes at least one algorithm from each list above. For example, in the NATIVE_NETWORK_ENCRYPTION option of your RDS option group, set:

sqlnet.encryption_types_server = AES256,AES192,AES128
sqlnet.crypto_checksum_types_server = SHA256,SHA1

After changing these option settings, the RDS instance may need a reboot for them to take effect (depending on whether the change is applied immediately).
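If you manage the option group programmatically, the change can be sketched with boto3. The option name and setting keys below follow the AWS RDS pattern for Oracle NNE, but verify them against the AWS docs for your engine version; the option group name is hypothetical:

```python
# Build the NATIVE_NETWORK_ENCRYPTION settings payload for
# rds.modify_option_group (names/values are assumptions to verify).
nne_settings = {
    "SQLNET.ENCRYPTION_SERVER": "REQUESTED",
    "SQLNET.ENCRYPTION_TYPES_SERVER": "AES256,AES192,AES128",
    "SQLNET.CRYPTO_CHECKSUM_SERVER": "REQUESTED",
    "SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER": "SHA256,SHA1",
}

options_to_include = [{
    "OptionName": "NATIVE_NETWORK_ENCRYPTION",
    "OptionSettings": [{"Name": k, "Value": v} for k, v in nne_settings.items()],
}]

# Uncomment to apply (requires AWS credentials and a real option group name):
# import boto3
# boto3.client("rds").modify_option_group(
#     OptionGroupName="my-oracle-options",   # hypothetical name
#     OptionsToInclude=options_to_include,
#     ApplyImmediately=True,
# )
print(options_to_include[0]["OptionName"])
```

Keeping the payload construction separate from the API call makes it easy to review the exact settings before applying them.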


STEP 3: CHECK THE CRYPTO_CHECKSUM SETTING

A commonly overlooked cause of "socket closed" during NNE negotiation is the integrity/checksum configuration. Even if encryption negotiation succeeds, if the checksum algorithms do not match, Oracle will close the connection. Make sure:

sqlnet.crypto_checksum_server = REQUESTED (or ACCEPTED)
sqlnet.crypto_checksum_types_server includes at least SHA256 or SHA1


STEP 4: TEST WITH EXPLICIT JDBC PROPERTIES (OPTIONAL DIAGNOSTIC)

Since you already tried using the ojdbc driver from Maven directly, you can further diagnose by setting explicit NNE properties on the JDBC connection. In a notebook on your cluster, try:

jdbc_url = "jdbc:oracle:thin:@//YOUR_HOST:1521/YOUR_SERVICE_NAME"

connection_properties = {
    "user": "YOUR_USER",
    "password": "YOUR_PASSWORD",
    "oracle.net.encryption_client": "REQUESTED",
    "oracle.net.encryption_types_client": "(AES256)",
    "oracle.net.crypto_checksum_client": "REQUESTED",
    "oracle.net.crypto_checksum_types_client": "(SHA256)",
    "oracle.jdbc.timezoneAsRegion": "false",
}

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "(SELECT 1 FROM DUAL)")
    .options(**connection_properties)
    .load()
)

df.show()

If this works, the issue is confirmed as an algorithm mismatch. If it still fails, try changing the encryption_types_client and crypto_checksum_types_client values to match what your server supports.
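If you need to sweep combinations, a small helper can generate the client property sets to feed into the Step 4 test one at a time (the algorithm names are the assumed thin-driver lists from Step 2):

```python
from itertools import product

# Enumerate client-side NNE property combinations to try one at a time.
# Algorithm names are the assumed thin-driver supported lists.
def nne_property_combos(enc_algs, checksum_algs):
    for enc, chk in product(enc_algs, checksum_algs):
        yield {
            "oracle.net.encryption_client": "REQUESTED",
            "oracle.net.encryption_types_client": f"({enc})",
            "oracle.net.crypto_checksum_client": "REQUESTED",
            "oracle.net.crypto_checksum_types_client": f"({chk})",
        }

combos = list(nne_property_combos(["AES256", "AES128"], ["SHA256", "SHA1"]))
print(len(combos))  # 4 combinations to try
```

Pass each dict into the `.options(**...)` call from the Step 4 example; the first combination that connects tells you which algorithms both sides agree on.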


STEP 5: CHECK FOR ORACLE RDS-SPECIFIC CONSIDERATIONS

Amazon RDS for Oracle has specific behavior around NNE:

1. NNE configuration is done through RDS option groups (not sqlnet.ora directly). Make sure the NATIVE_NETWORK_ENCRYPTION option is added to your RDS option group with compatible algorithms.

2. In the RDS option group, look for these settings under NATIVE_NETWORK_ENCRYPTION:
- SQLNET.ENCRYPTION_SERVER
- SQLNET.ENCRYPTION_TYPES_SERVER
- SQLNET.CRYPTO_CHECKSUM_SERVER
- SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER

3. RDS documentation for NNE configuration:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.Oracle.Options.NetworkEncryption.htm...


STEP 6: ENABLE JDBC DRIVER TRACING (ADDITIONAL DIAGNOSTICS)

For more detail on what is happening during the connection attempt, you can enable Oracle JDBC thin driver tracing on your Databricks cluster. Add these Spark config properties to your cluster configuration:

spark.driver.extraJavaOptions -Doracle.jdbc.Trace=true -Djava.util.logging.config.file=/tmp/ojdbc_logging.properties

Create the logging properties file in an init script:

cat > /tmp/ojdbc_logging.properties << 'EOF'
.level=FINE
handlers=java.util.logging.FileHandler
java.util.logging.FileHandler.pattern=/tmp/ojdbc_trace.log
java.util.logging.FileHandler.limit=50000000
java.util.logging.FileHandler.count=1
java.util.logging.FileHandler.formatter=java.util.logging.SimpleFormatter
EOF

After the failed connection attempt, check /tmp/ojdbc_trace.log on the driver for detailed NNE negotiation messages.
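Once the trace file exists, a quick filter can surface just the negotiation-related lines (the path and the keyword list are assumptions — adjust them to what your trace actually contains):

```python
import re

# Filter the JDBC trace for NNE negotiation messages. The default path and
# the keyword list are assumptions; adjust them to your logging setup.
def negotiation_lines(path="/tmp/ojdbc_trace.log"):
    pattern = re.compile(r"encrypt|checksum|ano|negotiat", re.IGNORECASE)
    with open(path) as f:
        return [line.rstrip() for line in f if pattern.search(line)]
```

Run it on the driver after a failed connection attempt; lines mentioning the encryption and checksum services should show which algorithms each side offered before the socket was closed.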


ADDITIONAL NOTES

- The Databricks documentation confirms that Lakehouse Federation uses NNE (not TLS) for non-Oracle Cloud connections: https://docs.databricks.com/aws/en/query-federation/oracle

- Make sure your Oracle RDS security group allows inbound traffic on port 1521 from the Databricks stable egress IPs (which you mentioned you have already done).

- If you are using a SQL warehouse (serverless), make sure you have configured a Network Connectivity Configuration (NCC) and that the stable NAT IPs are whitelisted on the RDS side: https://docs.databricks.com/aws/en/security/network/serverless-network-security/serverless-firewall....

- The "Socket read timed out" variant you saw with the Maven ojdbc driver points to the same root cause: the driver is waiting for the server to respond to its authentication/encryption handshake, but the server has already closed the socket due to algorithm negotiation failure.

If you have tried all the above and are still seeing the issue, I would recommend opening a Databricks support ticket. The support team can inspect the serverless/cluster-side connection logs at a deeper level and help identify the exact negotiation failure point.

* This reply used an agent system I built to research and draft this response based on the wide set of documentation I have available and previous memory. I personally review the draft for any obvious issues and for monitoring system reliability and update it when I detect any drift, but there is still a small chance that something is inaccurate, especially if you are experimenting with brand new features.

luketl2
Contributor

Accepting as the answer because the root cause turned out to be a firewall blocking port 1521. After eliminating all the other options we dug deeper on the networking side and found the issue. Thanks!