trolando / sylvan

Implementation of multi-core (binary) decision diagrams


Memory cap in sylvan_set_limits

allrtaken opened this issue · comments

Hi,

I am running Sylvan on machines with at least 32GB of RAM. I set the memory cap in sylvan_set_limits to 28GB (table ratio 3 and init ratio 5). However, Sylvan exits with the "BDD Unique table full" error. I checked the maximum resident set size (as reported by the time command) and it was always at most ~15GB. In other words, I suspect Sylvan is not using the full available RAM. Is there a workaround?

Thanks,
Aditya

For the table and cache sizes, Sylvan always picks the largest power of two that fits into the given amount of memory (and that also satisfies the given cache-to-nodes ratio). So yes, it is true that Sylvan does not make use of all the space you have available.

For reference, look here:
https://github.com/trolando/sylvan/blob/master/src/sylvan_common.c#L303

Whether there is a workaround is a good question. You could try changing the sylvan_set_limits function so that the bound is tighter, but I am unsure whether that would break any invariants in the rest of the code. Perhaps it would only break table resizing, in which case you could try an init ratio of 0 and hope that the added cache misses and memory initialisation do not introduce a major slowdown.

Thank you. For now I have simply increased the memory cap to ~32GB, which puts it slightly above the nearest power of two, and now most of the RAM is used. However, it would still be good to know whether sylvan_set_limits can be made tighter without breaking anything else.