Benjamin - 6 months ago
Scala Question

Insert geographic data into Elasticsearch from Spark

I am trying to index an RDD with latitude and longitude fields into Elasticsearch. I would like to use the geo_point type so I can plot the documents on a map. I tried creating a "location" field for each document containing either a string such as "12.25, -5.2" or an array of two doubles for lat/lon, but ES does not detect it as a geo_point. The index does not exist before I insert the data.

How can I tell ES that location is a geo_point?

Current code, using the elasticsearch-hadoop library, to store the data:

myRDD.saveToEs(indexName, someConf)

where myRDD is an RDD[Map] in which each map contains "location" -> [double, double],

and someConf contains "" -> "true"
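The document shape described above can be sketched as follows. This is a hypothetical illustration (the case class, helper names, and sample values are placeholders, not from the original post). Note that Elasticsearch accepts a geo_point either as a "lat,lon" string or as a [lon, lat] array; in the array form longitude comes first (GeoJSON order), which is a common source of confusion:

```scala
// Hypothetical sketch: build the per-document Map that myRDD would hold.
// Class and helper names are placeholders, not from the original post.
case class Reading(id: String, lat: Double, lon: Double)

// A geo_point value can be written as a "lat,lon" string ...
def toDocString(r: Reading): Map[String, Any] =
  Map("id" -> r.id, "location" -> s"${r.lat},${r.lon}")

// ... or as a [lon, lat] array (longitude FIRST, GeoJSON order).
def toDocArray(r: Reading): Map[String, Any] =
  Map("id" -> r.id, "location" -> Array(r.lon, r.lat))

val doc = toDocString(Reading("a", 12.25, -5.2))
```

Either shape is only interpreted as a geo_point if the index mapping already declares "location" with that type; otherwise ES infers a plain string or double field, which is exactly the problem described in the question.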


A working solution, as suggested in the comments:

  1. HTTP PUT to create the index with a mapping for this particular field
  2. Insert the RDD normally
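Step 1 can be sketched like this. The index and type names ("myindex", "mytype") are placeholders, and the type-level mapping structure shown matches the older Elasticsearch mapping format that elasticsearch-hadoop targeted at the time; the body is built as a plain string here so it can be sent with curl or any HTTP client:

```scala
// Hypothetical sketch of step 1: the mapping body to PUT when creating
// the index, BEFORE any document is inserted. "myindex" and "mytype"
// are placeholder names, not from the original post.
val createIndexBody =
  """{
    |  "mappings": {
    |    "mytype": {
    |      "properties": {
    |        "location": { "type": "geo_point" }
    |      }
    |    }
    |  }
    |}""".stripMargin
// PUT http://localhost:9200/myindex with this body (e.g. via curl),
// then run myRDD.saveToEs as usual and "location" is indexed as a geo_point.
```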

If you have already inserted data before creating the mapping, it is too late: the type of an existing field cannot be changed, so you have to reindex.