Observation-specific accuracy

Most GPS devices produce some estimate of the accuracy of the lat/lon in metres, ranging from almost guaranteed precision, i.e. < 5m (especially with 5G devices), to “we’re just guessing”, i.e. > 1000m.

Currently MapMatching just uses a single global accuracy parameter, measurementErrorSigma, and I have a feeling that results are sensitive to the value of that parameter. Certainly, anecdotally from my data, changes of only a few metres can determine whether the correct path is followed. It would be very useful to be able to include observation-specific accuracies where they are available. Note that this is possible in OSRM.
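
For context, this is roughly how a sigma enters the emission probability in Newson & Krumm style HMM matchers. I’m assuming GraphHopper’s emission model is of this form, so treat this as an illustration of why the results are sensitive to sigma rather than as the actual code:

```java
// Sketch only: Gaussian emission probability of the kind used by Newson & Krumm
// style HMM matchers, to show how strongly the score depends on sigma.
public class EmissionSigmaSensitivity {

    // log N(d; 0, sigma^2): log-likelihood of a GPS fix landing d metres away
    // from the candidate road position, given measurement noise sigma
    static double emissionLogProbability(double distanceMetres, double sigmaMetres) {
        return -0.5 * Math.log(2 * Math.PI * sigmaMetres * sigmaMetres)
                - (distanceMetres * distanceMetres) / (2 * sigmaMetres * sigmaMetres);
    }

    public static void main(String[] args) {
        double distance = 20; // candidate road position 20 m from the GPS fix
        for (double sigma : new double[]{5, 10, 25, 50}) {
            System.out.printf("sigma=%4.0fm  logProb=%8.2f%n", sigma,
                    emissionLogProbability(distance, sigma));
        }
    }
}
```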

I presume this will involve adding an “accuracy” field to the Observation class and using that in place of measurementErrorSigma in the MapMatching class, which looks fairly straightforward. However, it also seems to require changes to the computeViterbiSequence method, which may be more involved.
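
For what it’s worth, here is a rough sketch of the shape I have in mind. Everything in it (the accuracy field, accuracyOrDefault, the per-step loop) is a guess at how it could look, not the existing GraphHopper API:

```java
import java.util.List;

// Rough sketch, not the real API: Observation gains an optional accuracy (sigma, in metres),
// and the per-step candidate scoring falls back to the global measurementErrorSigma
// whenever no accuracy was reported.
public class PerObservationSigmaSketch {

    // accuracy <= 0 means "the device did not report one"
    record Observation(double lat, double lon, double accuracy) {
        double accuracyOrDefault(double globalSigma) {
            return accuracy > 0 ? accuracy : globalSigma;
        }
    }

    // A map-matching candidate: a road position some distance from the GPS fix
    record Candidate(String edgeId, double distanceToObservationMetres) {}

    // Same Gaussian emission as above, just parameterised per observation
    static double emissionLogProbability(double distanceMetres, double sigmaMetres) {
        return -0.5 * Math.log(2 * Math.PI * sigmaMetres * sigmaMetres)
                - (distanceMetres * distanceMetres) / (2 * sigmaMetres * sigmaMetres);
    }

    public static void main(String[] args) {
        double measurementErrorSigma = 50.0; // existing global default
        // one fix with a reported 4 m accuracy, one with nothing reported
        List<Observation> observations = List.of(
                new Observation(51.5, -0.1, 4.0),
                new Observation(51.5, -0.1, 0.0));
        List<Candidate> candidates = List.of(
                new Candidate("edge-A", 6.0), new Candidate("edge-B", 30.0));

        // Stand-in for the part of computeViterbiSequence that scores candidates
        // at each time step: the only real change is which sigma gets used.
        for (Observation obs : observations) {
            double sigma = obs.accuracyOrDefault(measurementErrorSigma);
            for (Candidate c : candidates) {
                System.out.printf("sigma=%.0fm %s logProb=%.2f%n", sigma, c.edgeId(),
                        emissionLogProbability(c.distanceToObservationMetres(), sigma));
            }
        }
    }
}
```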

If anyone knows whether this has already been tried, it’d be good to know. If not, I’ll try to code something.