Calling hopper.importOrLoad from Spark

I am trying to use GraphHopper within my Spark workflow but receive the following error:

```
Exception in thread "pool-2-thread-1" java.lang.NoSuchMethodError: com.google.protobuf.LazyStringList.getUnmodifiableView()Lcom/google/protobuf/LazyStringList;
    at org.openstreetmap.osmosis.osmbinary.Osmformat$HeaderBlock.<init>(Osmformat.java:300)
    at org.openstreetmap.osmosis.osmbinary.Osmformat$HeaderBlock.<init>(Osmformat.java:185)
    at org.openstreetmap.osmosis.osmbinary.Osmformat$HeaderBlock$1.parsePartialFrom(Osmformat.java:321)
    at org.openstreetmap.osmosis.osmbinary.Osmformat$HeaderBlock$1.parsePartialFrom(Osmformat.java:316)
    at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:141)
    at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:176)
    at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:188)
    at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:193)
    at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
    at org.openstreetmap.osmosis.osmbinary.Osmformat$HeaderBlock.parseFrom(Osmformat.java:749)
    at com.graphhopper.reader.pbf.PbfBlobDecoder.processOsmHeader(PbfBlobDecoder.java:87)
    at com.graphhopper.reader.pbf.PbfBlobDecoder.runAndTrapExceptions(PbfBlobDecoder.java:390)
    at com.graphhopper.reader.pbf.PbfBlobDecoder.run(PbfBlobDecoder.java:412)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
```
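For reference, the call that triggers this is roughly the following (a minimal sketch using the older GraphHopper 0.x-style API; the paths and the vehicle profile are placeholders, and the exact method names differ between GraphHopper versions):

```java
import com.graphhopper.GraphHopper;
import com.graphhopper.routing.util.EncodingManager;

public class ImportJob {
    public static void main(String[] args) {
        // Importing the PBF here is what reaches PbfBlobDecoder and the
        // protobuf classes from the stack trace above.
        GraphHopper hopper = new GraphHopper();
        hopper.setOSMFile("/data/area-latest.osm.pbf");       // placeholder path
        hopper.setGraphHopperLocation("/data/graph-cache");   // placeholder path
        hopper.setEncodingManager(new EncodingManager("car"));
        hopper.importOrLoad();
    }
}
```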

My first Spark job was written in Scala, so I thought that might be causing the error. See:
GitHub Issue

Now I have re-implemented the setup in Java but get the same error. Could there be a version conflict, since both Spark and GraphHopper use protobuf? I checked that com/google/protobuf/LazyStringList.class is contained in my jar file.
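One thing worth checking: having the class inside the fat jar does not guarantee that this copy wins at runtime, because Spark puts its own (older) protobuf on the classpath. A small diagnostic sketch that could be called from inside the Spark job to see which jar actually supplies the class and whether the missing method exists (ProtobufCheck is just a hypothetical helper name):

```java
import com.google.protobuf.LazyStringList;

public class ProtobufCheck {
    public static void check() {
        // Which jar did LazyStringList actually come from?
        System.out.println(LazyStringList.class
                .getProtectionDomain().getCodeSource().getLocation());

        // Does the loaded version have the method the stack trace complains about?
        try {
            LazyStringList.class.getMethod("getUnmodifiableView");
            System.out.println("getUnmodifiableView() is present");
        } catch (NoSuchMethodException e) {
            System.out.println("getUnmodifiableView() is missing -> an old protobuf is on the classpath");
        }
    }
}
```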

If it helps, I can provide a Git repo with my setup.

That could indeed be the problem. If we use an old version, we could easily fix this and use the same version. Or, if you don’t need to import the area directly in your Spark app, you could do the import separately, e.g. via the import script, and then importOrLoad will just load it. Or try some exclude/include magic.
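For the second option, a rough sketch of what the Spark side could look like, assuming the graph folder was already built beforehand (e.g. via the import script) and is available to the job; the path and the encoder are placeholders, and the API details depend on the GraphHopper version:

```java
import com.graphhopper.GraphHopper;
import com.graphhopper.routing.util.EncodingManager;

public class PrebuiltGraphLoader {
    public static GraphHopper load() {
        // No OSM file is set and the graph folder already exists, so
        // importOrLoad() only loads the prebuilt graph and never touches
        // the PBF/protobuf parsing code path.
        GraphHopper hopper = new GraphHopper();
        hopper.setGraphHopperLocation("/data/graph-cache");    // placeholder: prebuilt graph folder
        hopper.setEncodingManager(new EncodingManager("car")); // must match the settings used for the import
        hopper.importOrLoad();
        return hopper;
    }
}
```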

Lots of great ideas. I will try them and give you feedback :slight_smile: Thanks so far!

This topic was automatically closed 8 days after the last reply. New replies are no longer allowed.