Estimate of required memory to build graph from osm.pbf

Hi,

I’m writing a basic UI wizard for building a road network graph, to use with ODL Studio. I’ll be using it with both GH 0.5 and the latest GH code. I want to warn the user when Java has less memory available than it would likely need for building the graph. This means I need a rough-and-ready expression for a minimum or sensible amount of memory needed to build the graph, given an input osm.pbf file of known size. For example I might say that if the osm.pbf is 500 MB then Java needs at least 1 GB of memory available (i.e. twice the size), and the wizard will warn the user if less than 1 GB is available.

Does anyone have an idea what a good number would be? Say 2 or 3 times the input osm.pbf size?

This depends on whether you have CH enabled and, if so, how many vehicles, and whether elevation is included, etc. But roughly 1 to 1.5 times the size of the pbf, yes.
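The rule of thumb above could be turned into a simple pre-import check like the following sketch. The 1.5x factor and the class/method names are my own assumptions for illustration; CH profiles, multiple vehicles and elevation data can push the real requirement higher, so treat the estimate as a lower bound.

```java
import java.io.File;

// Hedged sketch: warn when the JVM heap looks too small to import an
// osm.pbf. The 1.5x factor follows the rough rule of thumb from this
// thread; it is NOT an official GraphHopper number.
public class MemoryCheck {

    // Assumption: ~1 to 1.5 times the pbf size; we use the upper end.
    static final double PBF_FACTOR = 1.5;

    // Estimated bytes of heap needed to build the graph from the pbf.
    static long estimateRequiredBytes(long pbfSizeBytes) {
        return (long) (pbfSizeBytes * PBF_FACTOR);
    }

    // True if the available max heap is below the estimate, i.e. the
    // wizard should show a warning.
    static boolean shouldWarn(long pbfSizeBytes, long maxHeapBytes) {
        return maxHeapBytes < estimateRequiredBytes(pbfSizeBytes);
    }

    public static void main(String[] args) {
        File pbf = new File(args.length > 0 ? args[0] : "map.osm.pbf");
        long maxHeap = Runtime.getRuntime().maxMemory();
        System.out.println("pbf size:       " + pbf.length());
        System.out.println("estimated need: " + estimateRequiredBytes(pbf.length()));
        System.out.println("max heap:       " + maxHeap);
        System.out.println("warn:           " + shouldWarn(pbf.length(), maxHeap));
    }
}
```

`Runtime.getRuntime().maxMemory()` reports the `-Xmx` ceiling, which is what matters here rather than currently free memory, since the heap grows on demand up to that limit.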


@karussell Thanks that’s great.

I’ve got a similar question to the one above. Do you have a rough idea of how the size of the built graph on disk (all files: edges, geometry, nodes, location_index, etc.) compares to the amount of RAM needed to hold it in memory once it’s fully loaded? I know you use quite compact data structures, so is the total size of all files on disk more or less equal to the size the graph takes up when loaded into RAM? Or is there some amount of ‘unpacking’ done when the graph is loaded into memory, which could increase its size compared to on disk?

It is identical, yes. Although you need a few extra pointers to hold the arrays in memory, that overhead should be negligibly small in comparison.
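Given that answer, a wizard could estimate load-time RAM simply by summing the file sizes in the graph folder. A minimal sketch, assuming a flat graph folder layout (the helper name is mine, not a GraphHopper API):

```java
import java.io.File;

// Hedged sketch: since the loaded graph reportedly takes roughly the
// same RAM as its on-disk footprint, summing the graph folder's file
// sizes gives a load-time RAM estimate.
public class GraphSize {

    // Sum the sizes of all regular files directly inside the folder
    // (edges, nodes, geometry, location_index, ...).
    static long onDiskBytes(File graphFolder) {
        long total = 0;
        File[] files = graphFolder.listFiles();
        if (files != null)
            for (File f : files)
                if (f.isFile())
                    total += f.length();
        return total;
    }

    public static void main(String[] args) {
        File folder = new File(args.length > 0 ? args[0] : "graph-cache");
        System.out.println("estimated RAM to load: " + onDiskBytes(folder));
    }
}
```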

Thanks for the answer!