I wanted to allocate a big char array in the data segment so I used some code like this one:
#include <iostream>
#include <string>

const int size = 1000000000;
static char chr[size];

int main()
{
    chr[size - 1] = 1; // line 1
    std::string s;
    std::cin >> s;     // line 2
}
line 1: I put this line so that the array is used at least once and isn't optimized out by the compiler
line 2: this stops execution so I can check the memory usage, e.g. in the Windows Task Manager
On a Windows system, while the program is blocked at line 2 waiting for input, Task Manager reports (in both the Memory and the Working Set columns) that the process is using far less than the expected 1 GB.
I then tried with the following code:
int main()
{
    for (int i = 0; i < size; ++i)
    {
        chr[i] = i;
    }
    std::string s;
    std::cin >> s; // line 2
}
Now, when the program reaches line 2, memory use climbs to the expected 1 GB after a few seconds of rapid growth.
It looks as if the memory were allocated dynamically rather than statically.
Is my understanding of arrays / the memory model wrong?
Does the compiler allocate large amounts of data dynamically as an optimization?
Or does Task Manager show only physically allocated memory, so that the 1 GB is initially kept on disk (e.g. in the page file) until first use?