Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to connect Neo4j Aura to a cluster

Nyarish
Contributor

Please help resolve this error:

org.neo4j.driver.exceptions.SecurityException: Failed to establish secured connection with the server

This occurs when I try to establish a connection from my cluster to Neo4j Aura.

Thank you.

1 ACCEPTED SOLUTION

-werners-
Esteemed Contributor III

OK, so I tested it for you.

With the init script, this is in extra.security:

jdk.tls.disabledAlgorithms=SSLv3, RC4, DES, MD5withRSA, DH keySize < 1024, EC keySize < 224, 3DES_EDE_CBC, anon, NULL

Without the init script, I get the same error as you.

So the script itself works.


17 REPLIES

Nyarish
Contributor

Any help, guys?

-werners-
Esteemed Contributor III

https://aura.support.neo4j.com/hc/en-us/articles/1500003161121

Aura needs some additional setup on your cluster.

Nyarish
Contributor

Hi @Werner Stinckens, thanks for the feedback. I've gone through that article several times, but I still get the same error. What could I be missing in my Databricks setup?

-werners-
Esteemed Contributor III

The shell script disables some encryption algorithms.

What you can do is check the logs to see if the bash script gives any errors.

Also check whether /databricks/spark/dbconf/java/extra.security is actually there.

And also check the settings which are set in the script.

That's about all I can think of. You could also try different Databricks runtime versions, or check the Databricks logs, because maybe you have to disable or enable a certain algorithm which is not in the bash script.

Thinking out loud here though.

Nyarish
Contributor

I am new to this...

You said to check whether /databricks/spark/dbconf/java/extra.security is actually there, and to check the settings which are set in the script.

Please, can you shed some more light?

-werners-
Esteemed Contributor III

You can check the content of the extra.security file using a notebook:

%sh

cat /databricks/spark/dbconf/java/extra.security

This should match what is in the article.

The script sets the jdk.tls.disabledAlgorithms parameter in Java.

So you can check the Java parameters on your cluster with

%sh

java -XshowSettings:all -version

The disabledAlgorithms setting should be there.

If everything is set as it should be, you will have to check logs on databricks side and neo4j side to see what is going on.
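As a sketch of that first check, the snippet below verifies that an extra.security-style file lists some of the algorithms the Aura article expects to be disabled. It works on a sample temp file; on a Databricks cluster you would point it at /databricks/spark/dbconf/java/extra.security instead (that path comes from this thread).

```shell
#!/bin/bash
# Sketch: check that an extra.security-style file disables the expected
# algorithms. A temp file stands in here for the real path used in this
# thread, /databricks/spark/dbconf/java/extra.security.
SAMPLE="$(mktemp)"
cat > "${SAMPLE}" <<'EOF'
jdk.tls.disabledAlgorithms=SSLv3, RC4, DES, MD5withRSA, DH keySize < 1024, EC keySize < 224, 3DES_EDE_CBC, anon, NULL
EOF
RESULTS=""
for algo in SSLv3 RC4 3DES_EDE_CBC; do
  if grep -q "${algo}" "${SAMPLE}"; then
    echo "${algo}: disabled (OK)"
    RESULTS="${RESULTS}${algo}:OK "
  else
    echo "${algo}: NOT disabled"
    RESULTS="${RESULTS}${algo}:MISSING "
  fi
done
rm -f "${SAMPLE}"
```

The same grep-based check can be pointed at the real file from a %sh notebook cell.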

Nyarish
Contributor

First of all, thank you for taking the time to help. If you don't mind, I am a newbie; kindly guide me on the steps I need to take next...

Here is a screenshot. What next from here?

[Screenshot 2021-09-13 at 11.20.40]

Here is the error from the notebook connecting to the Neo4j Aura database:

[Screenshot 2021-09-13 at 11.31.03]

-werners-
Esteemed Contributor III

I see the value GCM is present; I don't know if that is the issue. The comment in the file says it has to be present though.

Are you sure it is not an issue on the Neo4j side? E.g. allowing external connections, etc.

Nyarish
Contributor

Let me ask: how would you run this block of code in the notebook? I get an error loading this script:

#!/bin/bash

PROPERTIES_FILE="/databricks/spark/dbconf/java/extra.security"

DISABLED_ALGOS="SSLv3, RC4, DES, MD5withRSA, DH keySize < 1024, EC keySize < 224, 3DES_EDE_CBC, anon, NULL"

echo "Configure Databricks java for Aura access"

if [[ -f "${PROPERTIES_FILE}" ]]; then

  echo "setting jdk.tls.disabledAlgorithms..."

  echo "jdk.tls.disabledAlgorithms=${DISABLED_ALGOS}" | tee "${PROPERTIES_FILE}"

else

  >&2 echo "ERROR failed to find ${PROPERTIES_FILE}"

fi

-werners-
Esteemed Contributor III

This script has to run at cluster startup.

What it does is change that properties file. I don't think you can run it in a notebook.

What error do you get when loading the script?

Nyarish
Contributor

[Screenshot 2021-09-13 at 11.58.27]

-werners-
Esteemed Contributor III

You can't run this script in a notebook.

You have to provide it to the cluster in the cluster settings (Init Scripts).

The script has to be present at the location you specify (e.g. FileStore in DBFS).
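As a hedged sketch of that step: the snippet below writes the init script to a file and copies it into a directory that stands in for DBFS. The path dbfs:/FileStore/scripts/aura-init.sh and the script name are my assumptions, not from the thread; on Databricks you would copy the file there and then reference that DBFS path under the cluster's Init Scripts tab.

```shell
#!/bin/bash
# Sketch: stage the Aura init script somewhere the cluster can load it.
# A temp dir stands in for DBFS; on Databricks you would copy the file to
# /dbfs/FileStore/scripts/ and reference dbfs:/FileStore/scripts/aura-init.sh
# (path and file name are assumptions) in the cluster's Init Scripts setting.
DBFS_STANDIN="$(mktemp -d)"
cat > "${DBFS_STANDIN}/aura-init.sh" <<'EOF'
#!/bin/bash
PROPERTIES_FILE="/databricks/spark/dbconf/java/extra.security"
DISABLED_ALGOS="SSLv3, RC4, DES, MD5withRSA, DH keySize < 1024, EC keySize < 224, 3DES_EDE_CBC, anon, NULL"
if [ -f "${PROPERTIES_FILE}" ]; then
  echo "jdk.tls.disabledAlgorithms=${DISABLED_ALGOS}" | tee "${PROPERTIES_FILE}"
else
  >&2 echo "ERROR failed to find ${PROPERTIES_FILE}"
fi
EOF
chmod +x "${DBFS_STANDIN}/aura-init.sh"
STAGED="$(ls "${DBFS_STANDIN}")"
echo "staged: ${STAGED}"
```

After restarting the cluster with the init script attached, the extra.security check from earlier in the thread should show the new jdk.tls.disabledAlgorithms line.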

Nyarish
Contributor

I guess this setting option is not available in the Community Edition?

-werners-
Esteemed Contributor III

Can't say; I am not on that version.

It should be here:

[screenshot]
