09-11-2021 09:42 PM
Please help resolve this error:
org.neo4j.driver.exceptions.SecurityException: Failed to establish secured connection with the server
This occurs when I try to establish a connection from my cluster to Neo4j Aura.
Thank you.
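For reference, the connection is attempted roughly like the sketch below, assuming the Neo4j Connector for Apache Spark is attached to the cluster; the URI, username, password, and label are placeholders:

# Sketch only - the connector is assumed, all connection details are placeholders
df = (
    spark.read.format("org.neo4j.spark.DataSource")
    .option("url", "neo4j+s://<your-aura-id>.databases.neo4j.io")   # placeholder URI
    .option("authentication.basic.username", "neo4j")               # placeholder user
    .option("authentication.basic.password", "<password>")          # placeholder password
    .option("labels", "Person")                                     # any node label in the database
    .load()
)
display(df)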
09-13-2021 03:06 AM
OK, so I tested it for you.
With the init script, this is in extra.security:
jdk.tls.disabledAlgorithms=SSLv3, RC4, DES, MD5withRSA, DH keySize < 1024, EC keySize < 224, 3DES_EDE_CBC, anon, NULL
Without the init script, I get the same error as you.
So the script itself works.
09-12-2021 09:20 PM
any help guys?
09-12-2021 11:55 PM
https://aura.support.neo4j.com/hc/en-us/articles/1500003161121
Aura needs some additional setup on your cluster.
09-13-2021 12:00 AM
Hi @Werner Stinckens, thanks for the feedback. I've gone through that article several times, but I still get the same error. What could I be missing in my Databricks setup?
09-13-2021 12:07 AM
The shell script will disable some encryption algorithms.
What you can do is check the logs to see if the bash script gives any errors.
Also check if /databricks/spark/dbconf/java/extra.security is actually there.
And also check the settings which are set in the script.
That's about all I can think of. You could also try different versions of Databricks, or check the Databricks logs, because maybe you have to disable/enable a certain algorithm which is not in the bash script.
Thinking out loud here though.
09-13-2021 12:14 AM
I am new to this...
"/databricks/spark/dbconf/java/extra.security is actually there."
"And also check the settings which are set in the script."
Please, can you shed more light on these two points?
09-13-2021 01:15 AM
you can check the content of the extra.security file using a notebook:
%sh
cat /databricks/spark/dbconf/java/extra.security
This should match what is in the article.
The script will set the jdk.tls.disabledAlgorithms parameter in Java.
So you can check the java parameters on your cluster with
%sh
java -XshowSettings:all -version
the disabledAlgorithms should be set.
If everything is set as it should be, you will have to check logs on databricks side and neo4j side to see what is going on.
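If you want to see the value the driver JVM actually ended up with (not just the file on disk), something like this sketch from a Python cell should do it; it asks the JVM for the effective security property:

# Sketch: read the security property the driver JVM is actually using.
# If the init script took effect, this should match its DISABLED_ALGOS list.
disabled = spark._jvm.java.security.Security.getProperty("jdk.tls.disabledAlgorithms")
print(disabled)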
09-13-2021 01:24 AM
09-13-2021 01:44 AM
The value GCM is present; I don't know if that is the issue. The comment in the file says it has to be present, though.
Are you sure it is not an issue on the Neo4j side? E.g. allowing external connections, etc.
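To rule out the Aura side, you could also try a plain driver connection from a notebook, outside Spark entirely. A sketch, assuming the neo4j Python package is installed (e.g. %pip install neo4j) and using placeholder connection details:

# Sketch of a driver-level connectivity check, independent of Spark
from neo4j import GraphDatabase

uri = "neo4j+s://<your-aura-id>.databases.neo4j.io"               # placeholder URI
driver = GraphDatabase.driver(uri, auth=("neo4j", "<password>"))  # placeholder credentials
try:
    driver.verify_connectivity()  # raises on TLS/auth/routing problems
    print("Connected to Aura")
finally:
    driver.close()

If this connects fine, Aura itself is reachable from the cluster and the problem is most likely the JVM TLS configuration on the Databricks side.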
09-13-2021 01:49 AM
Let me ask: how would you run this block of code in the notebook? I get an error loading this script:
#!/bin/sh
PROPERTIES_FILE="/databricks/spark/dbconf/java/extra.security"
DISABLED_ALGOS="SSLv3, RC4, DES, MD5withRSA, DH keySize < 1024, EC keySize < 224, 3DES_EDE_CBC, anon, NULL"
echo "Configure Databricks java for Aura access"
if [[ -f "${PROPERTIES_FILE}" ]]; then
  echo "setting jdk.tls.disabledAlgorithms..."
  echo "jdk.tls.disabledAlgorithms=${DISABLED_ALGOS}" | tee "${PROPERTIES_FILE}"
else
  >&2 echo "ERROR failed to find ${PROPERTIES_FILE}"
fi
09-13-2021 01:54 AM
This script has to run at cluster startup.
What it does is change this properties file. I don't think you can run it in a notebook.
what error do you get when loading the script?
09-13-2021 01:59 AM
09-13-2021 02:22 AM
you can't run this script in a notebook.
You have to provide it to the cluster in the cluster settings (Init Scripts).
The script has to be present at the location you point to (e.g. FileStore in DBFS).
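One way to get it there is to write the script from a notebook cell and then point the cluster's Init Scripts setting at that path. A sketch; the DBFS path below is just an example:

# Sketch: write the init script from the Aura article to DBFS (example path)
script = """#!/bin/sh
PROPERTIES_FILE="/databricks/spark/dbconf/java/extra.security"
DISABLED_ALGOS="SSLv3, RC4, DES, MD5withRSA, DH keySize < 1024, EC keySize < 224, 3DES_EDE_CBC, anon, NULL"
echo "Configure Databricks java for Aura access"
if [[ -f "${PROPERTIES_FILE}" ]]; then
  echo "setting jdk.tls.disabledAlgorithms..."
  echo "jdk.tls.disabledAlgorithms=${DISABLED_ALGOS}" | tee "${PROPERTIES_FILE}"
else
  >&2 echo "ERROR failed to find ${PROPERTIES_FILE}"
fi
"""

dbutils.fs.put("dbfs:/FileStore/init-scripts/neo4j-aura.sh", script, True)  # example path

After adding that path under the cluster's Init Scripts and restarting the cluster, the extra.security check above should show the new value.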
09-13-2021 02:29 AM
I guess this settings option is not available in the Community Edition?
09-13-2021 02:31 AM