How to calculate at runtime the memory each IValue takes?
gkorland opened this issue · comments
We need to report at runtime the memory each IValue takes.
What is the best way to calculate its overhead, even ignoring IString interning?
I tried to calculate it this way:
```rust
use std::mem::size_of;

use ijson::{IValue, ValueType};

// Recursively sum the size of the IValue handle plus the payload of
// strings, arrays, and objects. (The unused Result/Error wrapper from
// the first attempt is dropped, since nothing here can fail.)
fn size(v: &IValue) -> usize {
    size_of::<IValue>()
        + match v.type_() {
            ValueType::Null | ValueType::Bool | ValueType::Number => 0,
            ValueType::String => v.as_string().unwrap().len(),
            ValueType::Array => v.as_array().unwrap().iter().map(size).sum(),
            ValueType::Object => v
                .as_object()
                .unwrap()
                .iter()
                .map(|(s, v)| s.len() + size(v))
                .sum(),
        }
}
```
The amount of memory directly owned by an IValue depends on the type:
- Null (none)
- Boolean (none)
- Number (depends on the value)
- String (interned)
- Non-empty array ((capacity of array + 2) * pointer size)
- Non-empty object ((capacity of object * 3 + 2) * pointer size)
- Empty array/object (none)
(Here, "empty" refers to having capacity = 0, not the length)
For Number:
- If has_decimal_point() is true (16 bytes)
- Else if in the range -128..=383 (none)
- Else if it can be represented in 24 bits (4 bytes)
- Otherwise (16 bytes)
To calculate the full cost of an IValue you'd need to do this recursively.
There's no guarantee that these won't change in future.
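To make the accounting rules above concrete, here is a sketch of the recursive calculation. The `Value` enum is a hypothetical stand-in for illustration only, not ijson's actual layout; `PTR` is the platform pointer size; and since a plain `Vec` sketch doesn't track capacity separately, `len()` is used as a stand-in for capacity, which undercounts when the real capacity is larger.

```rust
use std::mem::size_of;

// Hypothetical mirror of an IValue tree, used only to illustrate
// the per-type accounting rules described above.
enum Value {
    Null,
    Bool(bool),
    Number { has_decimal_point: bool, value: i64 },
    String(String), // interned in ijson; counted here as its UTF-8 length
    Array(Vec<Value>),
    Object(Vec<(String, Value)>),
}

const PTR: usize = size_of::<usize>();

// Bytes directly owned by one value, per the rules above.
fn direct_size(v: &Value) -> usize {
    match v {
        Value::Null | Value::Bool(_) => 0,
        Value::Number { has_decimal_point, value } => {
            if *has_decimal_point {
                16
            } else if (-128..=383).contains(value) {
                0 // stored inline, no allocation
            } else if (-(1i64 << 23)..(1i64 << 23)).contains(value) {
                4 // representable in 24 bits
            } else {
                16
            }
        }
        Value::String(s) => s.len(),
        Value::Array(a) if a.is_empty() => 0,
        Value::Array(a) => (a.len() + 2) * PTR, // (capacity + 2) * pointer size
        Value::Object(o) if o.is_empty() => 0,
        Value::Object(o) => (o.len() * 3 + 2) * PTR, // (capacity * 3 + 2) * pointer size
    }
}

// Full recursive cost: own allocation plus children (and object keys).
fn total_size(v: &Value) -> usize {
    direct_size(v)
        + match v {
            Value::Array(a) => a.iter().map(total_size).sum(),
            Value::Object(o) => o.iter().map(|(k, v)| k.len() + total_size(v)).sum(),
            _ => 0,
        }
}

fn main() {
    // A 2-element array owns (2 + 2) pointers; Null and Bool own nothing.
    let v = Value::Array(vec![Value::Null, Value::Bool(true)]);
    assert_eq!(total_size(&v), 4 * PTR);
}
```

Note that the child slots themselves live inside the parent's allocation (that is what the capacity term pays for), so children only contribute whatever they own on the heap.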
OK, I see that I was wrong...
Do you think we could add such a function to the library, so it stays future-compatible?
Maybe if it's behind a feature flag? It's kind of niche, and it's not a very well-defined metric.
> Else if in the range -128..=383 (none)

I think I get the -128 (== i8::MIN). Can you explain the =383?
-128 + 512 = 384. Since that upper bound is exclusive, the maximum possible value is 383.
This range was chosen because its size is a power of two and it covers all i8 or u8 values. I could have gone with -256..=255 instead, but I figured the positive values were more useful.
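The arithmetic can be checked directly; this snippet just verifies the properties stated above (512 values, a power of two, covering both i8 and u8):

```rust
fn main() {
    // Half-open range -128..384, equivalent to the inclusive -128..=383.
    let r = -128i64..384;

    // Covers every i8 value and every u8 value.
    assert!(r.contains(&(i8::MIN as i64)) && r.contains(&(i8::MAX as i64)));
    assert!(r.contains(&(u8::MIN as i64)) && r.contains(&(u8::MAX as i64)));

    // 383 is the largest value inside; 384 is excluded.
    assert!(r.contains(&383) && !r.contains(&384));

    // The range holds 512 = 2^9 values, a power of two.
    assert_eq!(r.count(), 512);
}
```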
@Diggsey can you please review the changes in RedisJSON/RedisJSON#912