Hi - I have tried my level best to go through both elasticsearch documentation as well as Databricks documentation to get an answer for my question - is it possible to connect to AWS elasticsearch of a different AWS account from Databricks? I did not find an answer. I am writing code in PySpark.
Apache Spark: 3.1.2, DBR: Standard 9.1 LTS
I followed the setup described in the Databricks documentation and ran the following code, but it did not help at all.
df = (spark.read
      .format("org.elasticsearch.spark.sql")
      .option("es.nodes", hostname)
      .option("es.port", port)
      .option("es.net.ssl", "true")
      .option("es.nodes.wan.only", "true")
      .load(index))
The error I am getting is: AccessControlException: Permission denied: user [***] does not have [read] privilege on [dbfs:/***] (I have masked the user and the ES name).
I think the right way is to assume a role using boto3 and then somehow configure the Spark session to use that IAM role. Am I correct?
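The assume-role step I have in mind looks roughly like this (just a sketch; the role name and account ID below are placeholders, and I'm still not sure how the returned credentials would be wired into the es-hadoop connector, since as far as I can tell it does not sign requests with AWS SigV4):

```python
def assume_es_role(role_arn: str, session_name: str = "databricks-es-read") -> dict:
    """Assume an IAM role in the account that owns the Elasticsearch
    domain and return the temporary credentials from STS."""
    import boto3  # available on Databricks clusters

    sts = boto3.client("sts")
    resp = sts.assume_role(RoleArn=role_arn, RoleSessionName=session_name)
    # resp["Credentials"] contains AccessKeyId, SecretAccessKey, SessionToken
    return resp["Credentials"]


# Placeholder ARN -- the real role would live in the other AWS account:
# creds = assume_es_role("arn:aws:iam::<other-account-id>:role/<es-read-role>")
```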
Or is it simply not doable through PySpark?