I have a large amount of data in RAM, where each user has several unsigned short (2-byte) values. The count per user ranges from 0 to 256, but the real distribution is heavily skewed: 12M users have just 1 value, 6M have 2, 3.5M have 3, and so on, up to a single user with 60 values.
I need to choose a datatype that is efficient in memory, access, and modification. I wanted to use std::vector for this purpose, but an empty vector is itself 24 bytes; with over 20M users that is roughly half a gigabyte of overhead before a single value is stored. This is too much for my RAM.
Now I want to use std::array<>, because it allocates no additional bytes beyond the space for the data itself, i.e.
sizeof(std::array<unsigned short, 3>)  // 6UL
// for a std::vector<unsigned short> v holding 3 elements:
sizeof(v) + sizeof(unsigned short) * v.capacity()  // 24UL + 6UL
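For reference, a self-contained check of those numbers; the exact sizes are implementation-defined, and 24 bytes is merely typical for a 64-bit std::vector:

#include <array>
#include <iostream>
#include <vector>

int main() {
    std::vector<unsigned short> v(3);
    // 6 on a typical 64-bit build (three 2-byte elements, no padding)
    std::cout << sizeof(std::array<unsigned short, 3>) << '\n';
    // 24 + 6 likewise: the vector header plus its heap block
    std::cout << sizeof(v) + sizeof(unsigned short) * v.capacity() << '\n';
}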
The problem is that std::array requires its size at compile time, and, as you can guess, I can't provide it. Is there a workaround? I store my users in a hash_table<unsigned int, T>. Can I, for example, store arrays of a different size for each key?
With one heap allocation per user you are already in vector memory consumption territory. You could allocate the maximum possible size for everyone, but that's also a huge waste. You're going to have to do something custom, I think. The size of a std::array is part of its type, so you can't have different-size arrays as values in a hash table. Have you considered std::unique_ptr<unsigned short[]>?
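A minimal sketch of that suggestion, assuming the element count is kept as a length prefix in slot 0 of the same allocation (it fits in an unsigned short) and using std::unordered_map in place of the question's hash_table; ShortBuf and make_buf are names invented for this example:

#include <algorithm>
#include <memory>
#include <unordered_map>

// One heap block per user: slot [0] is the element count,
// slots [1..count] are the values themselves.
using ShortBuf = std::unique_ptr<unsigned short[]>;

ShortBuf make_buf(const unsigned short* values, unsigned short count) {
    ShortBuf buf(new unsigned short[count + 1u]);
    buf[0] = count;                              // length prefix
    std::copy(values, values + count, &buf[1]);  // payload
    return buf;
}

int main() {
    std::unordered_map<unsigned int, ShortBuf> users;

    const unsigned short vals[] = {10, 20, 30};
    users[42] = make_buf(vals, 3);      // user 42 gets three values

    unsigned short n     = users[42][0];  // read the count back
    unsigned short first = users[42][1];  // first stored value
    (void)n; (void)first;
}

This costs one pointer per map entry plus one allocation per user, instead of std::vector's 24-byte header, at the price of reallocating and copying the whole block on every size change; given how few users have large counts, that trade-off may well be acceptable.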