'OutOfMemoryError' when building with large pbf files

Hi everyone,

I am trying to set up my own GraphHopper server. When I run `./graphhopper.sh web europe_belgium.pbf`, everything works flawlessly.

The problem only occurs when I try to process the PBF file for all of Europe (command: `./graphhopper.sh web europe.pbf`).

When I do this, the program starts running, and after about 10 minutes I get an `OutOfMemoryError`.

When this happens, there is enough storage space left, and RAM usage did not exceed 21%.

Here is the stack trace from the moment I get the error:
```
Exception in thread "pool-3-thread-2" java.lang.OutOfMemoryError: GC overhead limit exceeded
	at org.openstreetmap.osmosis.osmbinary.Osmformat$Way$1.parsePartialFrom(Osmformat.java:10593)
	at org.openstreetmap.osmosis.osmbinary.Osmformat$Way$1.parsePartialFrom(Osmformat.java:10588)
	at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:495)
	at org.openstreetmap.osmosis.osmbinary.Osmformat$PrimitiveGroup.<init>(Osmformat.java:3441)
	at org.openstreetmap.osmosis.osmbinary.Osmformat$PrimitiveGroup.<init>(Osmformat.java:3368)
	at org.openstreetmap.osmosis.osmbinary.Osmformat$PrimitiveGroup$1.parsePartialFrom(Osmformat.java:3496)
	at org.openstreetmap.osmosis.osmbinary.Osmformat$PrimitiveGroup$1.parsePartialFrom(Osmformat.java:3491)
	at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:495)
	at org.openstreetmap.osmosis.osmbinary.Osmformat$PrimitiveBlock.<init>(Osmformat.java:2421)
	at org.openstreetmap.osmosis.osmbinary.Osmformat$PrimitiveBlock.<init>(Osmformat.java:2356)
	at org.openstreetmap.osmosis.osmbinary.Osmformat$PrimitiveBlock$1.parsePartialFrom(Osmformat.java:2471)
	at org.openstreetmap.osmosis.osmbinary.Osmformat$PrimitiveBlock$1.parsePartialFrom(Osmformat.java:2466)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:137)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:168)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:180)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:185)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.openstreetmap.osmosis.osmbinary.Osmformat$PrimitiveBlock.parseFrom(Osmformat.java:2722)
	at com.graphhopper.reader.osm.pbf.PbfBlobDecoder.processOsmPrimitives(PbfBlobDecoder.java:369)
	at com.graphhopper.reader.osm.pbf.PbfBlobDecoder.runAndTrapExceptions(PbfBlobDecoder.java:393)
	at com.graphhopper.reader.osm.pbf.PbfBlobDecoder.run(PbfBlobDecoder.java:408)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Exception in thread "pool-3-thread-3" java.lang.OutOfMemoryError: GC overhead limit exceeded
	at java.util.Collections$UnmodifiableCollection.iterator(Collections.java:1038)
	at com.graphhopper.reader.osm.pbf.PbfBlobDecoder.buildTags(PbfBlobDecoder.java:144)
	at com.graphhopper.reader.osm.pbf.PbfBlobDecoder.processWays(PbfBlobDecoder.java:279)
	at com.graphhopper.reader.osm.pbf.PbfBlobDecoder.processOsmPrimitives(PbfBlobDecoder.java:377)
	at com.graphhopper.reader.osm.pbf.PbfBlobDecoder.runAndTrapExceptions(PbfBlobDecoder.java:393)
	at com.graphhopper.reader.osm.pbf.PbfBlobDecoder.run(PbfBlobDecoder.java:408)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
```

Any help pointing me in the right direction would be much appreciated.

Thank you in advance.

Have you tried giving it more memory, e.g. with `export JAVA_OPTS="-Xmx5g -Xms5g"`? And what do the preceding log statements say?

Hey @karussell,

I tried your suggestion yesterday evening and again today. Using the option keeps the problem at bay for longer, but it still comes up eventually. I also tried it with `export JAVA_OPTS="-Xmx10g -Xms10g"`, but no luck either.

Currently testing this with MMAP instead of RAM, but I'm not getting my hopes up.

Previous log statements are mostly warnings from the OSM reader concerning the parsing of certain data, like "date: week 46".

Currently building on a system with 12 GB of RAM (I was thinking this should be enough for just Europe).

I have also tried it on a cloud VM with 32 GB of RAM and ran into the same problem. (That was without the `export JAVA_OPTS="-Xmx5g -Xms5g"` command.)


Then the JVM uses its default heap size, which is far less than 5 GB, even if the machine has a lot more RAM. MMAP is a lot slower and requires a different heap configuration (a smaller heap, so that enough RAM is available off-heap).
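To make the trade-off concrete, here is a rough sketch of heap settings for the two storage modes discussed in this thread. The heap sizes are illustrative, not tuned values, and how you actually switch between RAM and MMAP storage is configured separately (check your GraphHopper config file):

```shell
# RAM storage: the graph lives on the Java heap, so give the JVM most of the RAM.
export JAVA_OPTS="-Xmx10g -Xms10g"
./graphhopper.sh web europe.pbf

# MMAP storage: the graph lives in memory-mapped files outside the heap,
# so keep the heap small and leave RAM free for the OS page cache.
export JAVA_OPTS="-Xmx2g -Xms2g"
./graphhopper.sh web europe.pbf
```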

> Currently building on a system with 12 GB of RAM (I was thinking this should be enough for just Europe).

Should be enough, yes, but I haven't tried just Europe for a long time. Which version are you using? And again, the full log would be helpful; maybe point to some external system to avoid cluttering the chat :wink:

I am using the latest version pulled from the GitHub master repo, with JDK 8.
The system is the latest version of Ubuntu.

All building is done from a clean install; I only installed git, JDK 8 and Maven.

I have placed the entire build log on a website of mine; here is the link:
build log

I hope I'm not breaking any rules by using a link like this. It was the first solution that came to mind for not cluttering the chat. :slight_smile:

How much RAM do you have on your machine? Try to assign all the available RAM to the JVM. What does jdk8 mean - OpenJDK or Oracle JDK?

The import was successful and the CH preparation has started, but too little RAM is available now:

"totalMB:4551, usedMB:4096"

The MMAP setting still requires several GB of RAM at certain points, like the CH preparation.

So either keep MMAP and slightly increase the xmx setting until it succeeds (try 7g), or again: increase xmx to the maximum available (probably minus 1 GB for the OS). Use 12 GB or better 15 GB of RAM; for a world-wide import you need roughly twice the size.
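As a quick sketch of "maximum available minus about 1 GB for the OS", the `-Xmx` value can be computed from the machine's total RAM. This is a hypothetical helper, assuming a Linux system where `/proc/meminfo` reports `MemTotal` in kB:

```shell
# Suggest an -Xmx of (total physical RAM - 1 GB), leaving ~1 GB for the OS.
# Assumes Linux; /proc/meminfo reports MemTotal in kB.
total_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
xmx_g=$(( total_kb / 1024 / 1024 - 1 ))
echo "export JAVA_OPTS=\"-Xmx${xmx_g}g -Xms${xmx_g}g\""
```

On a 12 GB machine this would print roughly `-Xmx10g` or `-Xmx11g`, depending on how much RAM the kernel actually reports.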

That is fine. Maybe we should enable .log and .txt files for upload here too…

So today I tried it on another system I have, using the RAM setting instead of MMAP, with `export JAVA_OPTS="-Xmx10g -Xms10g"`.

No luck so far. The system has 20 GB of RAM. I'm going to try the MMAP option on that system tomorrow, hoping that will work.

I'll keep you posted.

@karussell, I got it working using `export JAVA_OPTS="-Xmx10g -Xms10g"` and MMAP.

Takes forever, but it finishes :slight_smile:

Thank you for your help.

What does 'takes forever' mean? As the system has 20 GB, did you also try `-Xmx19500m` and RAM_STORE?

It took around 16 hours. No, I did not try the 19.5 GB RAM store. Should I need to recompile, I will give it a try.
