Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Unable to read data from Elasticsearch with Spark in Databricks

Data_Engineer3
Contributor III

When I try to read data from Elasticsearch with Spark SQL, it throws an error like:

RuntimeException: Error while encoding: java.lang.RuntimeException: scala.collection.convert.Wrappers$JListWrapper is not a valid external type for schema of string

Caused by: RuntimeException: scala.collection.convert.Wrappers$JListWrapper is not a valid external type for schema of string

It looks like the schema generated by Spark does not match the data received from Elasticsearch.

Could you let me know how I can read the data from Elasticsearch, either in CSV or Excel format?

4 REPLIES

AmanSehgal
Honored Contributor III

How are you reading data from Elasticsearch?

Are you exporting data from ES in JSON or CSV format and then reading it via Spark, or connecting to ES directly?

If you're connecting directly, then you can use the following snippet:

df = (spark.read
      .format("org.elasticsearch.spark.sql")
      .option("es.nodes", hostname)
      .option("es.port", port)
      .option("es.net.ssl", ssl)
      .option("es.nodes.wan.only", "true")
      .load(f"index/{index}")
     )

display(df)

If you're exporting in, say, JSON format using the elastic dump service, then use the following code snippet:

df = spark.read.json("<dbfs_path>/*.json").select("_id","_source.*")

This is because your file is exported as follows:

_id:string
_index:string
_score:long
_source:struct
         col_1:<data_type>
         col_2:<data_type>
         col_3:<data_type>
         col_4:<data_type>
         col_n:<data_type>

All your columns are nested inside _source.
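Since you asked about CSV: once the nested _source columns are flattened, the DataFrame can also be written back out as CSV. A minimal sketch, where the output path is a placeholder, not something from your setup:

flat_df = spark.read.json("<dbfs_path>/*.json").select("_id", "_source.*")

(flat_df.write
    .mode("overwrite")
    .option("header", "true")
    .csv("dbfs:/tmp/es_export_csv"))  # placeholder output path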

Hope this helps.

Hi @Aman Sehgal

I am trying to read the Elasticsearch data by connecting to it directly.

I am using the snippet below:

df = (spark.read.format("org.elasticsearch.spark.sql")
      .option("es.read.metadata", "false")
      .option("spark.es.nodes.discovery", "true")
      .option("es.net.ssl", "false")
      .option("es.index.auto.create", "true")
      .option("es.field.read.empty.as.null", "no")
      .option("es.read.field.as.array.exclude", "true")
      .option("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .option("es.nodes", "*")
      .option("es.nodes.wan.only", "true")
      .option("es.net.http.auth.user", elasticUsername)
      .option("es.net.http.auth.pass", elasticPassword)
      .option("es.resource", "indexname")
      .load())

But I am getting a runtime error:

RuntimeException: Error while encoding: java.lang.RuntimeException: scala.collection.convert.Wrappers$JListWrapper is not a valid external type for schema of string

Caused by: RuntimeException: scala.collection.convert.Wrappers$JListWrapper is not a valid external type for schema of string

Do you have a solution for it?

Note: I think the error occurs because the schema generated by Spark does not match the schema present in Elasticsearch.

Thanks

Prabakar
Databricks Employee

I believe this could be a known bug reported on the Elasticsearch Spark connector for Spark 3.0.

This connector is maintained by the open-source community, and we don't have an ETA on the fix yet.

Bug details:

https://github.com/elastic/elasticsearch-hadoop/issues/1635
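In the meantime, a workaround that is sometimes suggested for this class of error is to tell the connector explicitly which fields contain arrays, since the encoder fails when an array comes back for a field that Spark inferred as a string. A minimal sketch, assuming a hypothetical array field named "tags" (replace it with the fields that actually hold arrays in your index):

df = (spark.read.format("org.elasticsearch.spark.sql")
      .option("es.nodes", hostname)                       # your ES host
      .option("es.nodes.wan.only", "true")
      .option("es.net.http.auth.user", elasticUsername)
      .option("es.net.http.auth.pass", elasticPassword)
      # Declare fields that hold arrays so they are not inferred as plain strings.
      # "tags" is a placeholder; list your actual array fields, comma-separated.
      .option("es.read.field.as.array.include", "tags")
      .load("indexname"))

display(df)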

You can look for the latest connector supporting Spark 3.0 in the Maven repository.

What is the DBR version that you are using for the cluster?
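If you're not sure, a quick way to check from a notebook is sketched below; the cluster tag key is the one usually exposed on Databricks, but treat it as an assumption for your environment:

# Spark version bundled with the runtime
print(spark.version)

# Databricks Runtime version string (assumed standard cluster tag key)
print(spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion"))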

Vidula
Honored Contributor

Hi there @KARTHICK N

Hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!
