I'm using numpy to create a cube array with sides of length 100, thus containing 1 million entries total. For each of the million entries, I am inserting a 100x100 matrix whose entries are randomly generated numbers. I am using the following code to do so:
import random
from numpy import *

cube = arange(1000000).reshape(100,100,100)
for element in cube.flat:
    matrix = arange(10000).reshape(100,100)
    for entry in matrix.flat:
        entry = random.random()*100
    element = matrix
I was expecting this to take a while, but with 10 billion random numbers being generated, I'm not sure my computer can even handle it. How much memory would such an array take up? Would RAM be a limiting factor, i.e. if my computer doesn't have enough RAM, could it fail to generate the array at all?
Also, if there is a more efficient way to implement this code, I would appreciate tips :)
Double-precision floats are 8 bytes each, so if you really are trying to store 10 billion of them, that's 80 GB. If you have to ask, your computer doesn't have enough memory. That said, it looks like you're creating them all but not storing them: assigning to the loop variables entry and element just rebinds those names, it never writes anything back into the arrays. So you should be fine.
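For reference, the arithmetic, assuming float64 values (numpy's default dtype):

import numpy as np

# 100**3 cube entries, each holding a 100x100 matrix -> 10 billion values
n_values = 100**3 * 100 * 100
print(n_values)                                               # 10000000000
print(n_values * np.dtype(np.float64).itemsize / 1e9, "GB")   # 80.0 GB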
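And if you did want to build a smaller version of this structure, the idiomatic numpy approach is a single 5-D array filled by one vectorized call rather than nested Python loops. A minimal sketch, with the side length reduced to 10 (my choice, so the example actually fits in RAM):

import numpy as np

side = 10    # reduced from 100 so this runs on an ordinary machine
cube = np.random.rand(side, side, side, side, side) * 100
print(cube.shape)                # (10, 10, 10, 10, 10)
print(cube.nbytes / 1e6, "MB")   # 0.8 MB at side=10; 80 GB at side=100

At side = 100 that same call would try to allocate the full 80 GB at once, so on most machines it would fail with a MemoryError (or grind into swap) rather than silently succeed.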