Thanks for taking the time to read this. I've been learning C for a few days and am stuck. I'm playing around with creating huge arrays (several GB) and cannot seem to create an array that's bigger than 2GB. Here is my code:
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* Exploring 1d array sizes in C */
int main()
{
    int i;
    double mbs;
    int arr_length = 260000000;
    int arr_size = sizeof(double) * arr_length;
    double *arr = malloc(arr_size);

    for(i = 0; i < arr_length; i++)
        arr[i] = (double)i;

    /* Print array size */
    arr_size = (double)arr_size;
    mbs = (double)arr_size / pow(1024.0, 2);
    printf("The size of the array is %1.1f megabytes. \n", mbs);
    return 0;
}
When I run the code, I get a reasonable result:
:~/c-examples> gcc -o array-size array-size2.c
:~/c-examples> ./array-size
The size of the array is 1983.6 megabytes.
However, if I increase arr_length to 270000000 (270 million), I get a segmentation fault even though the array would be just over 2GB in size. I'm currently running 64 bit OpenSuse 13.1, and have 6GB of RAM:
:~/c-examples> free -h
                     total       used       free     shared    buffers     cached
Mem:                  5.6G       910M       4.7G        27M        12M       377M
-/+ buffers/cache:                520M       5.1G
Swap:                 2.0G       307M       1.7G
I was hoping to eventually be able to store arrays of size 10-12GB (after adding more RAM), but I want to make sure I understand exactly what's going on before going further. Thanks again for your time - suggestions (and criticism!) are most welcome.