Questions on migration from GH <0.1 to 0.9

My GH application is a navigation app for horse riding and hiking. It was developed before the first stable release of GH. Now I am trying to migrate it to 0.9 stable. Since there has been a lot of progress since then, there are some questions you can hopefully help me with. I use flexible routing with custom and dynamic weighting.

  • Obviously, the trove library is no longer used by GH. What is the replacement e.g. for TIntArrayList?
  • I found the docs on creating a new Weighting. But for my usecase I need to provide some custom routing parameters like “relaxed hiking” or “include steep inclines” and somehow give the weighting class access to them. Is there a recommended way (and maybe example) how to do this?
  • I think I will need to create a new elevation provider for the DeFerranti DEM in order to cover regions above 60°. Does such an EP already exist? If not, would this be of interest to GH (or rather not, because of the proprietary license)?

Your help would be appreciated.
Nop


I wish you good luck and success with this, do not hesitate to let us know if some problems happen or report back about the progress :slight_smile:

Obviously, the trove library is no longer used by GH. What is the replacement e.g. for TIntArrayList?

We now use hppc, and the replacement is just IntArrayList; see an example here.

and somehow give the weighting class access to them. Is there a recommended way (and maybe example) how to do this?

Yes, you can still use the same procedure, and you’ll have access to all URL parameters via the HintsMap parameter.

Does such a EP already exist? If not would this be of interest to GH (Or rather not because of the proprietary license)?

We recently added a few more EPs:

https://github.com/graphhopper/graphhopper/pulls?q=is%3Apr+elevation+is%3Aclosed

There you can see the mixed provider, which might be interesting for you, but we do not have a DeFerranti DEM provider. The proprietary license for the data is no problem (although we highly prefer open data); we can still have the source code open :slight_smile:

Also see all the other improvements in this topic, and the open pull requests when you search for ‘elevation’.

Thanks - I already have the first issue. :slight_smile:

The two implementations are not fully compatible. I am using void insert(int index, int[] data), which is not supported by hppc. Doing multiple calls with one value each is inefficient because every call causes an arraycopy. I extended GHIntArrayList with the missing method.
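The efficiency point can be shown with a self-contained sketch (a plain int[] buffer, not the real GHIntArrayList, so all names here are illustrative): a bulk insert needs only one arraycopy to shift the tail and one to copy the new values, instead of one shift per inserted element.

```java
// Sketch: a bulk insert(int index, int[] data) on a growable int buffer.
class IntList {
    private int[] buffer = new int[8];
    private int size = 0;

    void add(int value) {
        ensureCapacity(size + 1);
        buffer[size++] = value;
    }

    void insert(int index, int[] data) {
        ensureCapacity(size + data.length);
        // shift the existing tail right in a single pass
        System.arraycopy(buffer, index, buffer, index + data.length, size - index);
        // copy the new values into the gap
        System.arraycopy(data, 0, buffer, index, data.length);
        size += data.length;
    }

    int get(int index) { return buffer[index]; }
    int size() { return size; }

    private void ensureCapacity(int min) {
        if (min > buffer.length)
            buffer = java.util.Arrays.copyOf(buffer, Math.max(min, buffer.length * 2));
    }
}
```

Inserting n values this way costs two arraycopies total, versus n tail shifts when looping over a single-value insert.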

Can you please elaborate on that? So the parameters are supposed to be sent as Hints?
(I am still trying to figure out whether it is easier to use the GH web interface or keep my old implementation).

I noted the multi-EP improvement, alas that’s after 0.9 stable. I still think I’ll try to create a DeFerranti EP to get the usual 90m mesh instead of the 250m of GMTED.

Yes, if we use the current GraphHopperServlet you get all URL parameters in this map; see the createWeighting method in GraphHopper and every weighting constructor like FastestWeighting.
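The flow is roughly: the servlet collects URL parameters into the hints map, and the weighting reads them in its constructor. A self-contained model of that idea (the classes below only mimic the real HintsMap/Weighting API; names and parameters like `relaxed_hiking` are illustrative assumptions):

```java
import java.util.HashMap;
import java.util.Map;

// Minimal stand-in for GH's HintsMap: string key/value pairs with typed getters.
class Hints {
    private final Map<String, String> map = new HashMap<>();

    Hints put(String key, Object value) { map.put(key, value.toString()); return this; }

    boolean getBool(String key, boolean defaultValue) {
        String v = map.get(key);
        return v == null ? defaultValue : Boolean.parseBoolean(v);
    }
}

// A custom weighting reading its parameter from the hints,
// e.g. ?relaxed_hiking=true appended to the request URL.
class RelaxedHikingWeighting {
    private final boolean relaxed;

    RelaxedHikingWeighting(Hints hints) {
        this.relaxed = hints.getBool("relaxed_hiking", false);
    }

    double calcWeight(double distance, double steepnessPenalty) {
        // in relaxed mode, penalize steep edges more heavily
        return relaxed ? distance + 2.0 * steepnessPenalty : distance + steepnessPenalty;
    }
}
```

The point is only the wiring: the weighting never parses URLs itself; it asks the hints map, with a default, and the servlet layer fills that map from the request.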

Sounds great, please keep us updated :).

I think such an EP would indeed be very interesting for GH. Not sure what your timeline is on this feature, but maybe we can merge this PR before you start creating another PR for VFP’s DEM. In this PR a lot of the methods that are used in different EPs are abstracted, which should make it quite easy to write a new EP. @karussell IIRC the mentioned PR should be mergeable. It would be interesting to compare the results including the road smoothing (#1220).

BTW: Interesting alternatives to the VFP DEMs could be the ALOS AW3D30 data or the ASTER GDEM data (the latter also has a questionable license).

Cheers,
Robin

As for updates: The simpler of my FlagEncoders (for hiking) with custom weighting is compiling; import and a test routing case run without errors. I still have no idea whether the graph and routing result make any sense; that is what I need to find out now. :slight_smile:

@karussell: I have another question: In the old version there was a parameter osmreader.instructions=false to disable gathering the required data during import. I can’t find it anymore. Is it deprecated? Is there still an option to leave out instruction data (which I still don’t need)?

@boldtrn: As I am working based on 0.9, that PR is out of scope. My timeline is tight; I have only one week, and part of that time is given to my daughters. :slight_smile: I already have the code as part of my Garmin map generator Map Composer. Actually, this is where the first version of the GH elevation provider came from, but the structure has changed over time. The biggest challenge with VFP is the download; the files are arranged in various unpredictable packages.

After checking out multiple reviews and comparisons of SRTM1, JAXA and ASTER data sets, I still believe that CGIAR and VFP provide the best quality of void filling because they received some manual attention. I used to build the elevation contours for my map from SRTM3 and always had trouble with completely nonsensical results like missing mountains or downhill lakes for water skiing. :slight_smile: Therefore I prioritize quality over detail level.


That should be datareader.instructions now.
You can also check the changelog for help with the migration.
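For reference, in a 0.9-style config file that would look something like the following (the key name comes from the answer above; the surrounding lines are only illustrative):

```
# disable gathering of instruction data during import
datareader.instructions=false
```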


I have routing with my HikeFlagEncoder basically working now. But in one point I am not sure whether I am doing something harmful.

I need to know about hiking relations during import. I do not need any relation information during routing, as it is all calculated into the weight during import. There is a new method handleRelationTags(). I set some bits there, abusing the way flags, and expect them to be delivered to handleWayTags() later. I do not define any bits in defineRelationBits() as I don’t need them stored.
Is this correct, or am I potentially breaking something?

Yes, this should work (using the way flags)

Another thought (progress must be good if I am thinking about new features :slight_smile: ): Is there a way to include information from the way flags in the routing result? I am marking some edges as preferred and the routing uses them, but I would like to visualize, on the map showing the route, which parts were selected because they are preferred.

Alas, it does not. The EncodingManager applies a mask to the relationFlags, so they must be defined properly.
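The masking behaviour can be illustrated with plain bit operations (a self-contained sketch modelled on the idea, not the real EncodingManager code): bits reserved via something like defineRelationBits() survive the mask; bits set outside the reserved range are silently cleared before handleWayTags() ever sees them.

```java
// Sketch: a manager that reserves a bit range per encoder and masks
// relation flags, so unreserved bits are lost.
class RelationBits {
    private long mask = 0;
    private int nextBit = 0;

    // analogous to defineRelationBits(): reserve 'bits' bits, widen the mask
    int reserve(int bits) {
        int shift = nextBit;
        mask |= ((1L << bits) - 1) << shift;
        nextBit += bits;
        return shift;
    }

    // analogous to the mask applied to the relationFlags
    long applyMask(long relationFlags) {
        return relationFlags & mask;
    }
}
```

Any bit set during relation handling that was never reserved is masked away here, which is why the bits must be defined properly.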

One more note: During import I spotted estimated_distance and estimated_center in the ReaderWay properties. This led me to a significant amount of calculations which apply only to turn costs and maybe spatial evaluations, but are always executed. If it hasn’t already been done after 0.9, this looks like some room for optimization. I use a stripped copy of OSMReader now with that code removed.

As for progress: Horse navigation seems to basically work again in a small sample region. Now I need to build the VFP elevation provider to upscale to the full map area.

Yes, you can do this. See e.g. the path details feature, which captures changes along the edges; see the closed PRs: Pull requests · graphhopper/graphhopper · GitHub, as well as the issues and this forum.

As for progress: Horse navigation seems to basically work again in a small sample region. Now I need to build the VFP elevation provider to upscale to the full map area.

Cool, nice!

Interesting feature. I can see the difficulty of somehow preserving the detail information through the Douglas-Peucker simplification. I think I’ll store that idea for later, when I’m migrating to GH 0.10. :slight_smile:

Another question: Looking at the SRTM elevation provider I noted that the files used by GH are 3.13 MB in size while the original .hgt is 2.9 MB. There seems to be a short header, but most of the difference appears to be just filled with 0. Is this a bug or a feature?

This is already handled from the framework and preserves details where necessary :slight_smile:

There seems to be a short header, but most of the difference appears to be just filled with 0. Is this a bug or a feature?

There is a fixed-length but short header which is indeed required for our storage (always was). But this is probably not the reason, as it is just a few bytes. I think the reason is that our storage format by default uses 1MB segments (this is changeable, e.g. to 100KB, but I am not sure if there will be performance changes).
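The observed sizes are consistent with rounding the raw tile up to whole segments. A rough back-of-the-envelope sketch, assuming 1 MiB segments and a 1201×1201 SRTM3 tile with 2-byte samples (the small header is not accounted for): the raw tile is about 2.88 MB, and three full segments come to about 3.15 MB, in the same ballpark as the 3.13 MB reported above.

```java
// Sketch: raw SRTM3 tile size vs. size after rounding up to whole 1 MiB segments.
class SegmentPadding {
    static final long SEGMENT = 1 << 20; // 1 MiB

    static long rawTileBytes() {
        return 1201L * 1201L * 2L; // 1201x1201 samples, 2 bytes each
    }

    static long paddedBytes(long raw) {
        long segments = (raw + SEGMENT - 1) / SEGMENT; // ceiling division
        return segments * SEGMENT;
    }
}
```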

The VFP elevation provider is working now. Next steps are to test everything with the real data set for the whole map and get it into a deployable .jar file.

I tried to implement all required changes by subclassing and keep GH 0.9 unmodified. That almost worked. There is a method HeightTile.setHeights() with package scope which I needed to declare public in order to use the class with the VFP provider in my own package. (It still has package scope in the current master.)

A rather trivial extension was adding the missing insert(int index, int[] data) to IntArrayList.

I derived a VFP elevation provider from the SRTM provider, as they use the same .hgt files. The main difference is the method of finding the right zips to download. It is based on the 0.9 version before the refactoring for the multi elevation provider. I do not have time to port it to current master, build a test, and make a proper PR. But if you are interested, I’d be happy to hand over the VFP class.

One more question: For testing the import I am running in the context of the reader-osm module. Unfortunately, log4j is not configured there, so I just get a warning and all output is lost. Running in the core module, everything logs just fine. Aunt Google says that I need a log4j.properties file in the classpath, but I cannot find any such file in the GH repository, so it must be a different mechanism used by GH core.

What do I need to do to enable logging in the reader-osm module?

reader-osm and reader-core are just libraries. Have a look into the tools or web module, where there is a log4j config file: https://github.com/graphhopper/graphhopper/tree/master/web/src/main/resources

I copied the file from tools into reader-osm. The warning for a missing config file disappeared but no output is showing up on the console.

You need it in your application's root classpath; there is no need to modify the libraries. E.g. when you run the web or import process, you see that it works.
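A minimal log4j.properties that prints to the console could look roughly like this (the appender name and pattern are illustrative), placed at the root of the application classpath, e.g. src/main/resources:

```
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{HH:mm:ss} %-5p %c{1} - %m%n
```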