I have run into the following issue with R on both Linux and Windows. In its simplest form, I have a 3- or 4-dimensional array that I gradually fill with smaller arrays.
A <- array(NA, dim = c(500, 1000, 1000))    # 5e8 cells, initialised as logical NA
B <- array(rnorm(1e6), dim = c(1000, 1000)) # one 1000 x 1000 slice of random values
for (i in 1:500) A[i,,] <- B                # copy the slice into A one layer at a time
The interesting thing is that even though A is clearly already allocated, memory usage shoots up as soon as the loop starts, to the point where the workstation becomes unusable. For context, running the loop (the third line above) can quickly fill 24 GB of RAM when A has dimensions 2000 x 2000 x 400.
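In case it is useful, here is a scaled-down sketch of how I have been trying to quantify this. The size estimate simply assumes 8 bytes per double; tracemem() only reports anything on an R build compiled with memory profiling, and I read the "max used" column of gc() as the peak footprint. The reduced dimensions (50 instead of 500 slices) are just so the effect can be watched without exhausting RAM.

# Back-of-the-envelope size: one numeric copy of a 2000 x 2000 x 400 array
n_cells <- 2000 * 2000 * 400
n_cells * 8 / 1024^3          # ~11.9 GiB, so hitting 24 GB implies at least one extra full copy

# Scaled-down version of the loop so the behaviour can be observed safely
A <- array(NA, dim = c(50, 1000, 1000))
B <- array(rnorm(1e6), dim = c(1000, 1000))
tracemem(A)                   # prints a message whenever A is duplicated or coerced
gc(reset = TRUE)
for (i in 1:50) A[i,,] <- B
gc()                          # "max used" column shows the peak memory during the loop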
Does anyone know why this happens, and whether there are ways to circumvent the issue?