I think so, but I will look for the practical one in the next piece to make sure I get it. My sticking point is: if you need to store N numbers in a set (or record that they are present in the set), how can the structure always be the same size without losing fidelity the way the Bloom filter does?

Ahhh, now I see where the issue is. Please take a closer look at the definition of MO. It is not the space consumed by the data structure, but the ratio between the space consumed and the actual data being stored.

So if, for each 32-bit number (base data) we store, we also had to store a 32-bit pointer (auxiliary data), our MO = (32*n + 32*n) / (32*n) = (32 + 32) / 32 = 64 / 32 = 2.
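A toy sketch of that calculation (the bit sizes are illustrative assumptions, not measurements of any real runtime):

```python
BASE_BITS = 32  # each stored number (base data)
PTR_BITS = 32   # per-element pointer (auxiliary data)

def memory_overhead(n: int) -> float:
    """MO = total space consumed by the structure / space of the base data."""
    consumed = n * BASE_BITS + n * PTR_BITS
    base = n * BASE_BITS
    return consumed / base

# The n cancels out, so the MO is the same at every size -- constant overhead.
for n in (10, 1_000, 100_000):
    print(n, memory_overhead(n))  # 2.0 every time
```

Note how n cancels in the ratio: that is exactly why linear structures like lists get a constant MO.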

Having said that, it should be clear that in linear structures like lists or arrays the MO is always constant. This also makes sense: for each element we add, we pay the same space overhead (e.g. one new pointer).

If you look at two-dimensional data structures like trees, you will see that the overhead grows with n: adding more elements requires a growing amount of auxiliary data.

As a rule of thumb, you can take the asymptotic space requirement, e.g. O(n), and divide it by the number of elements in the data structure to get the asymptotic MO requirement, e.g. O(1).

Makes sense? Thanks for asking; I think this part was not clear to many people, including me when I first read the paper.

Please note that in a log, i.e. the update-optimal solution, we also have a linear structure but a non-constant memory overhead. This is because our logical structure is a set, not a list: all the "outdated" entries in the log must still be counted, since they consume space even though they are no longer part of the logical set. Do you know what I mean?
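A minimal sketch of that log-as-set idea (the class and its names are hypothetical, made up for illustration; re-inserting a key just appends, so stale entries keep consuming space):

```python
BASE_BITS = 32  # each key is a 32-bit number, as in the example above

class LogSet:
    """An append-only log used as the physical representation of a logical set."""

    def __init__(self):
        self.log = []  # every write is an append; nothing is overwritten

    def add(self, key: int):
        self.log.append(key)  # even if key is already present

    def contains(self, key: int) -> bool:
        return key in self.log  # reads must scan the whole log

    def memory_overhead(self) -> float:
        physical = len(self.log) * BASE_BITS       # space actually consumed
        logical = len(set(self.log)) * BASE_BITS   # base data: the distinct keys
        return physical / logical

s = LogSet()
for k in [1, 2, 3, 1, 2, 1]:  # keys 1 and 2 are re-inserted
    s.add(k)
print(s.memory_overhead())  # 2.0: six log entries represent a three-element set
```

The structure is still linear, but the MO is driven by the number of writes, not the size of the logical set, so it grows without bound as updates accumulate.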

## re: RUM Conjecture - Reasoning About Data Access

Got it! Was my explanation of the deterministic MO = 1 structure understandable?


Man, totally, I completely like... read the MO definition, grokked it, then promptly forgot to think about it! Thank you, Frank!