You are very correct! I’ve hacked it to run with planar (NZTM) coordinates, and it wasn’t pretty, or fun. That said, half of my time was actually spent converting the OSM data, which was also very hacky - I could convert OSM → planar easily enough using ogr2ogr (into e.g. SQLite format), but converting that back to osm/pbf for GH was very painful.
Anyway, to brass tacks. I created a test set of about 5000 random A→B routes (from Auckland, NZ). I ran three tests:
- A: using defaults (i.e. CH).
- B: no CH, plus turn costs, using the ALT_ROUTE algorithm with `alternative_route.max_paths=10`, but only for the first 100 routes.
- C: no CH, using the ASTAR algorithm, again only for the first 100 routes.
Timings
With planar:
A: 3.14s (sd=0.07s) with 99 errors - all 'Connection between locations not found'. [samples: 3.11, 3.10, 3.07, 3.34, 3.17, 3.13, 3.17, 3.09, 3.07, 3.09, 3.15, 3.17, 3.15]
B: 13.8s (sd=1.0s) with 0 errors. [samples: 13.37, 13.35, 13.41, 15.71, 13.42]
C: 2.29s (sd=0.07s) with 0 errors. [samples: 2.51, 2.27, 2.31, 2.25, 2.26, 2.27, 2.22, 2.26, 2.28, 2.28, 2.30, 2.26, 2.28]
Without planar (i.e. standard WGS84):
A: 3.78s (sd=0.03s) with 0 errors. [samples: 3.77, 3.73, 3.80, 3.82, 3.74, 3.80, 3.85, 3.79, 3.80, 3.77, 3.82, 3.75, 3.78]
B: 12.8s (sd=0.7s) with 0 errors. [samples: 12.09, 13.40, 13.26, 12.00, 13.32]
C: 2.07s (sd=0.03s) with 0 errors. [samples: 2.07, 2.12, 2.11, 2.07, 2.09, 2.08, 2.08, 2.06, 2.04, 2.11, 2.01, 2.07, 2.06]
Apples and oranges
On the one hand, it’s the same computer, and I’ve tried to ensure the variables are the “same” (e.g. `minResolutionInMeter` is the same for both). On the other, I’ve done a lot of stuff which I don’t fully understand. For example, in `intToDegrees` I just guessed and changed `DEGREE_FACTOR` to `10f`, and in `bresenham.calcPoints` I just left the hardcoded value at 0.1 (if it was too big, stuff seemed to be missing from the tree … though I thought it would’ve needed to increase for these coordinates). I also did lots of manual hacks while testing and trying to get it to work … some of which may still be lying around = ) Anyway, any of these might contribute to the 99 errors above, and to the slowness of the non-CH approaches.
Oh, I also removed all of the `normedDistance` stuff (as it was confusing) … but I’ve now realised I should have left it in (and just returned the square of the planar distance). Not sure how much that’s slowed things down, but I doubt it’s much.
Conclusions
Firstly, it needs more investigation by someone who can make a fair comparison! That said, if those errors aren’t falsely speeding it up*, and if I haven’t done something silly to give misleading results, then it looks pretty good for CH. Especially as it’s essentially a net gain (in terms of both speed and simplicity) for “free” (i.e. no algorithmic changes required, just a simpler coordinate system). [In my case, it also allows me to do some things which I’m not sure I could in WGS84.]
As for the other algorithms - I’m not sure what’s going on there, as I can’t see how simpler calculations could have slowed them down. So I’m guessing it’s to do with how I utilised some of the internal data structures (which maybe didn’t affect the CH graph), or one of the issues above.
Where did the speedups occur? Mostly from replacing distance etc. calculations with simple Pythagoras, plus a few simplifications, e.g. not needing to check for longitude boundary crossings.
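To illustrate the swap (a rough sketch of what I did, not GraphHopper’s actual code - the names are mine): the planar distance is plain Pythagoras, and the “normed” variant I should have kept is just the squared distance - cheap, and monotonic with the real distance, which is all comparisons (e.g. nearest-neighbour lookups) need:

```java
public class PlanarDist {
    // Real distance: plain Pythagoras, no trigonometry or earth radius involved.
    static double calcDist(double x1, double y1, double x2, double y2) {
        return Math.sqrt(calcNormalizedDist(x1, y1, x2, y2));
    }
    // "Normed" distance: squared Euclidean distance. Monotonic with calcDist,
    // so comparisons can skip the sqrt entirely.
    static double calcNormalizedDist(double x1, double y1, double x2, double y2) {
        double dx = x2 - x1, dy = y2 - y1;
        return dx * dx + dy * dy;
    }
    public static void main(String[] args) {
        System.out.println(calcDist(0, 0, 3, 4));           // 5.0
        System.out.println(calcNormalizedDist(0, 0, 3, 4)); // 25.0
    }
}
```

Compare that to the haversine formula for WGS84, which needs several trig calls per distance - that’s where I assume most of the per-calculation saving comes from.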
Could it be useful? Maybe. If the data used is covered by an (accurate) planar coordinate system, then it’s possible. However, as OSM only supports WGS84, we’d either need to implement coordinate conversion on import (which should be pretty trivial with e.g. Proj4J), or find an alternative tool for changing the coordinates in the OSM data.
How hard to implement (well)? I don’t really know, but I don’t think it’d be that bad. All I really added was a new `DistanceCalc` for planar coordinates. The rest would just be tidying up the code to e.g. have a configurable `DistanceCalc`, and maybe changing `lat, lon, ele` to `x, y, z`, or similar.
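As a rough sketch of what “configurable” might look like (the interface and class names here are hypothetical, only loosely modelled on GraphHopper’s `DistanceCalc` - the real interface has more methods):

```java
// Hypothetical minimal interface; the planar version is a drop-in for the spherical one.
interface DistCalc {
    double calcDist(double aY, double aX, double bY, double bX);
}

// Great-circle distance for WGS84 (haversine), the default case.
class HaversineCalc implements DistCalc {
    static final double R = 6371000; // mean earth radius in metres
    public double calcDist(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1), dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * R * Math.asin(Math.sqrt(a));
    }
}

// Planar drop-in: same interface, trivial maths.
class PlanarCalc implements DistCalc {
    public double calcDist(double y1, double x1, double y2, double x2) {
        double dx = x2 - x1, dy = y2 - y1;
        return Math.sqrt(dx * dx + dy * dy);
    }
}

public class ConfigurableDist {
    public static void main(String[] args) {
        // The storage/lookup code would hold one DistCalc reference, chosen at setup:
        DistCalc calc = new PlanarCalc(); // or new HaversineCalc() for WGS84
        System.out.println(calc.calcDist(0, 0, 3, 4)); // 5.0
    }
}
```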
Finally, I’m assuming most of the speedup occurs in the location lookups. With that in mind, it’s possible it’ll have a greater impact in map-matching.
@karussell, what do you think?
* @karussell, do you know whether they’d speed it up or slow it down? At most the speedup would be like ignoring 99/5000 tests, which doesn’t account for all of the difference.