I am doing empirical research on the impact of wind-turbine proximity on property values. I have built the spatial lag, spatial error, and SAC models. My problem is that, after specifying the weights matrix, I cannot run the regression models: R gives me the error "cannot allocate vector of size 70 Gb". There are a few things to mention:
My original data contain 1.4 million observations, but the models fail even with 100k observations.
Some properties have identical coordinates, because each coordinate is the centroid of the grid cell in which the house is located (some cells contain more than one house, so several houses share the same coordinates). I specified the weights matrix with a fixed distance band.
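To illustrate how common the shared centroids are, here is a quick check (my own diagnostic sketch, assuming `house_final` is an `sf` object as in the code below):

```r
library(sf)  # for st_coordinates()

# Count houses whose coordinate pair also occurs for an earlier house,
# i.e. houses sitting on a centroid that is already used
coords <- st_coordinates(house_final)
sum(duplicated(coords))
```

Every group of k houses on the same centroid within the 2000 m band becomes a fully connected clique in the neighbour list, which inflates its size.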
My computer has 32 GB of RAM.
library(spdep)       # dnearneigh(), nb2listw()
library(spatialreg)  # errorsarlm()

# Distance-band neighbours: all points within 2000 m of each other
neighbors_nb <- dnearneigh(x = st_geometry(house_final), d1 = 0, d2 = 2000)
# Convert the neighbour list to a row-standardized weights list
weights_list <- nb2listw(neighbors_nb, style = "W", zero.policy = TRUE)
# Spatial error model
model_e <- errorsarlm(eq_spa, data = house_final,
                      listw = weights_list,
                      method = "eigen", zero.policy = TRUE)
My question: how can I deal with the error "cannot allocate a vector of size 70 Gb"? If I can run the regression with 10k observations but not more than that, what is the likely culprit: the matrix size, the identical coordinates, or something else? Thank you very much in advance.
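For context, here is my own back-of-the-envelope calculation for a dense n × n matrix of doubles (which, as far as I understand, is what `method = "eigen"` materialises), and it lines up with the allocation error I am seeing:

```r
# Memory for a dense n x n matrix of doubles (8 bytes per cell)
n <- 100000
n^2 * 8 / 1024^3  # about 74.5 GB at n = 100k
```

At the full 1.4 million observations the dense matrix would be several orders of magnitude larger still.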