Please pardon my terminology - I’m fairly new to GraphHopper.
I’m generating a new GraphHopper map/nodes/edges from an OSM file using GraphHopperOSM, with a custom encoder that adds a new parameter to each edge. Let’s call this new parameter the “danger” value. I have a separate table that maps various coordinates (latitude and longitude) to integer danger values. While the graph edges are being generated (I think in the applyWayTags method), I want to check whether each edge I’m given matches any of the coordinates in my table and, if so, assign it the danger value from my table; otherwise, just assign a default.
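To make the idea concrete, here is a minimal, self-contained sketch of what I mean by the coordinate-to-danger table. All names here (DangerTable, put, lookup) are my own placeholders, not GraphHopper API; the assumption is that coordinates can be rounded to ~5 decimal places (roughly 1 m) so that an edge point and a table entry that are effectively the same location hash to the same key.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical coordinate -> danger table (placeholder names, not GraphHopper API).
// Lat/lon are rounded to 5 decimal places and packed into a single long key,
// so points within rounding distance of a stored coordinate match it.
public class DangerTable {
    private static final int DEFAULT_DANGER = 0;
    private final Map<Long, Integer> dangerByCell = new HashMap<>();

    // Pack rounded lat/lon into one 64-bit key (upper half lat, lower half lon).
    private static long key(double lat, double lon) {
        long latE5 = Math.round(lat * 1e5);
        long lonE5 = Math.round(lon * 1e5);
        return (latE5 << 32) ^ (lonE5 & 0xffffffffL);
    }

    public void put(double lat, double lon, int danger) {
        dangerByCell.put(key(lat, lon), danger);
    }

    // Returns the stored danger value, or the default when no coordinate matches.
    public int lookup(double lat, double lon) {
        return dangerByCell.getOrDefault(key(lat, lon), DEFAULT_DANGER);
    }

    public static void main(String[] args) {
        DangerTable table = new DangerTable();
        table.put(52.52001, 13.40495, 7);
        // A point within rounding distance of a table entry matches it...
        System.out.println(table.lookup(52.520012, 13.404951));
        // ...and anything else falls back to the default.
        System.out.println(table.lookup(48.85661, 2.35222));
    }
}
```

During edge generation I would then call something like lookup(lat, lon) for each point of the edge’s geometry and assign the result to my custom encoded value.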
I’m not sure how to do this. My original idea was to turn my table of coordinate-danger pairs into a table of edge-danger pairs; I would then be able to easily check whether each edge passed into my applyWayTags method appears in that table. I could do this by turning each coordinate into a GHPoint, and then turning each point into its closest edge via the LocationIndex returned by hopper.getLocationIndex(), using LocationIndex.findClosest. However, I can’t do this until the edges of my “hopper” have been generated, by which point all the danger values should already have been assigned, because the OSM file will have already been translated into the nodes and edges GraphHopper uses. It’s a bit of a catch-22.
How can I solve this problem? Is there another way for me to easily check whether an edge matches any of the coordinates in my table?
Thank you very much.