Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Not able to connect to Neo4j AuraDB from Databricks

saab123
New Contributor II

I am trying to connect to a Neo4j AuraDB instance (f9374927).

 

  1. Created a free professional instance of Neo4j. I am able to connect to this instance and add nodes and relationships.

  2. Created a Databricks shared cluster on 14.3 LTS (includes Apache Spark 3.5.0, Scala 2.12), added the Maven coordinates org.neo4j:neo4j-connector-apache-spark_2.12:5.3.6_for_spark_3, and ran the code below.

  3. I am getting a "can't connect" error. After looking at this support question (https://community.databricks.com/t5/data-engineering/how-to-connect-neo4j-aura-to-a-cluster/td-p/156...), I added the following init script to my cluster configuration, but the cluster won't start after adding the init script.

#!/bin/bash
echo "Configuring Databricks to connect to Neo4j"
# Ensure Java security settings are correctly configured (if needed for Neo4j Aura).
PROPERTIES_FILE="/databricks/spark/dbconf/java/extra.security"
DISABLED_ALGOS="SSLv3, RC4, DES, MD5withRSA, DH keySize < 1024, EC keySize < 224, 3DES_EDE_CBC, anon, NULL"

if [[ -f "${PROPERTIES_FILE}" ]]; then
  echo "Setting jdk.tls.disabledAlgorithms..."
  echo "jdk.tls.disabledAlgorithms=${DISABLED_ALGOS}" | tee -a "${PROPERTIES_FILE}"
else
  echo "ERROR: Failed to find ${PROPERTIES_FILE}"
  exit 1
fi
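For context, the write code from step 2 followed the standard Neo4j Spark connector pattern, roughly the sketch below. The credentials and node label are placeholders, not my real values, and the Spark write call itself is shown commented out since it only runs on a live cluster:

```python
# Rough sketch of the connector write (assumption: username, password, and
# label below are placeholders, not the real values).

def is_aura_uri(uri: str) -> bool:
    # AuraDB instances only accept the encrypted routing schemes
    # neo4j+s:// or neo4j+ssc://; bolt:// and plain neo4j:// fail
    # discovery with "No routing server available".
    return uri.startswith(("neo4j+s://", "neo4j+ssc://"))

options = {
    "url": "neo4j+s://f9374927.databases.neo4j.io",
    "authentication.basic.username": "neo4j",       # placeholder
    "authentication.basic.password": "<password>",  # placeholder
    "labels": ":Person",                            # placeholder label
}
assert is_aura_uri(options["url"])

# The actual write on the cluster, using these options:
# (df.write
#    .format("org.neo4j.spark.DataSource")
#    .mode("Overwrite")
#    .option("node.keys", "name")   # required for Overwrite mode
#    .options(**options)
#    .save())
```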


Here are the console and driver logs. Please help resolve this issue.

Console Logs:

 

Py4JJavaError: An error occurred while calling o571.save. : org.neo4j.driver.exceptions.ServiceUnavailableException: Could not perform discovery for database '<default database>'. No routing server available.

 

Driver Logs:

25/04/07 18:22:44 WARN DynamicSparkConfContextImpl: Ignored update because id 1744046542240 < 1744046542240; source: CONFIG_FILE

25/04/07 18:22:44 INFO DatabricksILoop$: Received SAFEr configs with version 1744046542240

25/04/07 18:23:05 WARN RediscoveryImpl: Received a recoverable discovery error with server 'f9374927.databases.neo4j.io(54.146.202.10):7687', will continue discovery with other routing servers if available. Complete failure is reported separately from this entry.

25/04/07 18:23:36 WARN RediscoveryImpl: Received a recoverable discovery error with server 'f9374927.databases.neo4j.io(54.160.21.207):7687', will continue discovery with other routing servers if available. Complete failure is reported separately from this entry.

25/04/07 18:23:44 WARN DynamicSparkConfContextImpl: Ignored update because id 1744046542240 < 1744046542240; source: CONFIG_FILE

25/04/07 18:23:44 INFO DatabricksILoop$: Received SAFEr configs with version 1744046542240

25/04/07 18:23:48 WARN JupyterKernelListener: Received Jupyter debug message with unknown command: null

25/04/07 18:24:06 WARN RediscoveryImpl: Received a recoverable discovery error with server 'f9374927.databases.neo4j.io(23.23.29.38):7687', will continue discovery with other routing servers if available. Complete failure is reported separately from this entry.

25/04/07 18:24:06 INFO ProgressReporter$: Removed result fetcher for 1744047307476_5211541610516211935_0475a19e17354317b2232a91e26be872

25/04/07 18:24:06 INFO PresignedUrlClientUtils$: FS_OP_CREATE FILE[https://databricks-mda-workspace-cloud-storage-bucket.s3.us-west-2.amazonaws.com/oregon-prod/1637039...] Presigned URL: Started uploading stream using AwsPresignedUrl
