graphhopper / graphhopper

Open source routing engine for OpenStreetMap. Use it as a Java library or standalone web server.

Home Page: https://www.graphhopper.com/open-source/


Default MaxWeight resolution is insufficient for truck routing

otbutz opened this issue · comments

Describe the bug

This way has a maxweight tag with a value of 44: https://www.openstreetmap.org/way/4281359 (StreetView)

Our current default values limit the maximum storable weight to 0.1 * 2^8 = 25.6 t:

/**
* Currently enables to store 0.1 to max=0.1*2⁸ tons and infinity. If a value is between the maximum and infinity
* it is assumed to use the maximum value. To save bits it might make more sense to store only a few values like
* it was done with the MappedDecimalEncodedValue still handling (or rounding) of unknown values is unclear.
*/

The problem is that we are limiting every value we find to this maximum. This causes the marked limit of 44 to be silently changed to 25.6 in the graph.
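For illustration, the capped range follows directly from the bit width and the factor. A minimal sketch (not GraphHopper's actual implementation; the names are made up for this example):

```java
// Illustrative only: the largest weight a factor-based decimal encoded
// value can represent is factor * 2^bits, the bound quoted above.
// (In the real implementation one raw bit pattern is reserved for infinity.)
public class MaxWeightRange {
    public static double maxStorable(int bits, double factor) {
        return factor * (1 << bits);
    }

    public static void main(String[] args) {
        System.out.println(maxStorable(8, 0.1)); // current default: 25.6 t
        System.out.println(maxStorable(9, 0.1)); // with 9 bits: 51.2 t
    }
}
```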

The result is quite surprising:

[screenshots: routing result]

To Reproduce

Expected behavior

Either we should raise the default maximum to cover existing limits, or we should skip values between the maximum and infinity.

Judging from the values on the taginfo website, we should probably increase the size of the encoded value to 9 bits:

https://taginfo.openstreetmap.org/keys/maxweight#values

A maximum value of 0.1 * 2^9 = 51.2 t would cover the heaviest trucks in Europe and most of the USA.

The alternative approach of increasing the factor to 0.2, 0.3 or 0.5 would cost us precision for frequently used values, e.g.:

| Value \ Factor | 0.1 | 0.2 | 0.5 |
|---|---|---|---|
| 3.5 | x | | x |
| 2.8 | x | x | |
| 51.4 | x | x | |
| 6000 lbs (~2.72) | (x) | | |
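The table entries can be checked mechanically: a value is exactly storable with a given factor iff value / factor is (numerically) an integer. An illustrative helper, not part of GraphHopper:

```java
// Illustrative check: does this value survive a round-trip through the
// given factor without loss? An epsilon absorbs double rounding noise.
public class FactorCheck {
    public static boolean exactlyStorable(double value, double factor) {
        double steps = value / factor;
        return Math.abs(steps - Math.round(steps)) < 1e-9;
    }

    public static void main(String[] args) {
        System.out.println(exactlyStorable(3.5, 0.1)); // true
        System.out.println(exactlyStorable(3.5, 0.2)); // false (17.5 steps)
        System.out.println(exactlyStorable(2.8, 0.5)); // false (5.6 steps)
    }
}
```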

Is it really a good idea to silently cap values here?

} else if (value >= maxStorableValue * factor) { // equality is important as maxStorableValue is reserved for infinity
super.uncheckedSet(reverse, edgeId, edgeIntAccess, maxStorableValue - 1);
return;
}

I'd argue that we should treat "overflows" the same way as deliberately passing Double.POSITIVE_INFINITY:

if (useMaximumAsInfinity && value >= maxStorableValue * factor) {
    super.setInt(reverse, edgeId, edgeIntAccess, maxStorableValue);
    return;
}

This would also restore symmetry with DecimalEncodedValue.getNextStorableValue(double) which also returns Double.POSITIVE_INFINITY in this case.
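A minimal sketch of the proposed semantics with simplified names (not the real DecimalEncodedValueImpl signatures): an overflowing value is stored as the reserved raw maximum and reads back as infinity, exactly as if Double.POSITIVE_INFINITY had been passed.

```java
// Simplified model of the proposed behavior; field and method names
// are illustrative, not GraphHopper's actual API.
public class OverflowDemo {
    final int maxStorableValue;  // raw bit pattern reserved for infinity
    final double factor;
    final boolean useMaximumAsInfinity;
    private int stored;

    OverflowDemo(int bits, double factor, boolean useMaximumAsInfinity) {
        this.maxStorableValue = (1 << bits) - 1;
        this.factor = factor;
        this.useMaximumAsInfinity = useMaximumAsInfinity;
    }

    void set(double value) {
        if (useMaximumAsInfinity && value >= maxStorableValue * factor) {
            // overflow treated like an explicit Double.POSITIVE_INFINITY
            stored = maxStorableValue;
            return;
        }
        stored = (int) Math.round(value / factor);
    }

    double get() {
        if (useMaximumAsInfinity && stored == maxStorableValue)
            return Double.POSITIVE_INFINITY;
        return stored * factor;
    }
}
```

With 8 bits and factor 0.1, setting the 44 t limit from the issue would then yield infinity (i.e. "no limit known") instead of a silent 25.5 t cap.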

See #2990

@karussell we can also stick with 8 bits and try to map the values as an enum instead. With 256 possible values, we should be able to cover all common limits in the source material. The downside, of course, is that an enum is a single point of failure and would need to be re-evaluated from time to time.

try to map the values as an enum instead

But then it can no longer be (easily) used as a number in a custom model (?)

Agreed. Let's roll with 9 bits 🙂

There is still the idea of a "Mapped"EncodedValue, i.e. instead of a factor it holds only distinct values. This could be interesting for a couple of EncodedValues as it could still be used as a number. The tricky thing could be the rounding.

The tricky thing could be the rounding.

I'd search the value with a configurable epsilon per EV.

What I meant is: if the stored values have bigger gaps, like 3, 4, 10, 27, then the value 20 would be rounded up to 27. But IMO this goes against the conservative approach: a vehicle of weight 22 would pass although it shouldn't. So it might be necessary to implement different rounding strategies or something.
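The conservative strategy would round down to the largest stored value not exceeding the tag value, e.g. via binary search over the sorted distinct values. An illustrative sketch, assuming a sorted array of mapped values:

```java
import java.util.Arrays;

// Sketch of a conservative "round down" lookup over distinct mapped
// values, instead of rounding to the nearest one. Illustrative only.
public class ConservativeRounding {
    // sorted: distinct values observed during import, ascending
    static double roundDown(double[] sorted, double value) {
        int idx = Arrays.binarySearch(sorted, value);
        if (idx >= 0) return sorted[idx];      // exact match
        int insertion = -idx - 1;              // index of first element > value
        if (insertion == 0) return sorted[0];  // below smallest: clamp
        return sorted[insertion - 1];          // largest value <= value
    }

    public static void main(String[] args) {
        double[] values = {3, 4, 10, 27};
        // maxweight=20 becomes 10, not 27, so no overweight vehicle slips through
        System.out.println(roundDown(values, 20)); // 10.0
    }
}
```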

It's also tricky from a maintenance perspective. How do we know if we're missing a popular value? We would need to keep track of all the variants and how often they occur during import to be able to provide meaningful logging.

Yes, sounds tricky. We could collect this information directly in the encoded value and then, after "graph.freeze", decide how to best use the available bits. We could limit the tracked values to 100 * (available values) and throw out less frequent values during the import to avoid a memory problem. But I'm not sure how feasible this is.

private static final Map<Double,Integer> ENCOUNTERED_VALUES = new LinkedHashMap<>() {
    @Override
    protected boolean removeEldestEntry(Map.Entry<Double, Integer> eldest) {
        return size() > 10_000;
    }
};

// ...

ENCOUNTERED_VALUES.compute(value, (k, v) -> v == null ? 1 : v + 1);

Edit: If we want to only log offending ones
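After the import, the collected counts could be reduced to the most frequent values that still fit the available bits. A hypothetical helper sketch (not an existing GraphHopper API):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical post-freeze step: keep only the N most frequent observed
// values, where N is the number of raw slots (e.g. 2^bits minus one
// slot reserved for infinity). Illustrative only.
public class TopValues {
    static List<Double> mostFrequent(Map<Double, Integer> counts, int slots) {
        return counts.entrySet().stream()
                .sorted(Map.Entry.<Double, Integer>comparingByValue().reversed())
                .limit(slots)
                .map(Map.Entry::getKey)
                .sorted() // store the chosen values in ascending order
                .collect(Collectors.toList());
    }
}
```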